# Dataset of t_cms/T-CMS/T-CMS (Girls' Frontline)
This is the dataset of t_cms/T-CMS/T-CMS (Girls' Frontline), containing 15 images and their tags.
The core tags of this character are `grey_hair, long_hair, multicolored_hair, streaked_hair, bangs, hair_between_eyes, breasts, purple_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 36.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 14.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 39 | 32.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 28.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 39 | 56.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_cms_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
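The `IMG+TXT` packages are plain archives of images with text caption files, without the waifuc metadata of the raw package. Below is a minimal sketch for working with one of them, assuming only the `dataset-800.zip` filename listed above and the usual layout of one `.txt` caption per image with a matching basename (verify against the extracted archive):
```python
# Hedged sketch: download an IMG+TXT package and pair each image with its caption.
# The one-.txt-per-image layout is an assumption; adjust if the archive differs.
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

zip_file = hf_hub_download(
    repo_id='CyberHarem/t_cms_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)
extract_dir = Path('dataset_800')
extract_dir.mkdir(exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)

for image_path in sorted(extract_dir.rglob('*')):
    if image_path.suffix.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    caption_path = image_path.with_suffix('.txt')
    caption = caption_path.read_text(encoding='utf-8').strip() if caption_path.exists() else ''
    print(image_path.name, '->', caption[:60])
```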
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/t_cms_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, blush, jacket, fur_trim, goggles_around_neck, coat, off_shoulder, bare_shoulders, black_gloves, black_shorts, open_clothes, holding, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | jacket | fur_trim | goggles_around_neck | coat | off_shoulder | bare_shoulders | black_gloves | black_shorts | open_clothes | holding | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:---------|:-----------|:----------------------|:-------|:---------------|:-----------------|:---------------|:---------------|:---------------|:----------|:--------------------|
| 0 | 15 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
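If the clustered tags above are reused as text-to-image training captions, one optional preprocessing step is to turn the underscore-joined tags into plain phrases. A minimal sketch (the exact caption format is a downstream choice, not something defined by this dataset):
```python
# Minimal sketch: convert an underscore-joined tag string (as shown in the
# cluster table above) into a plain comma-separated caption.
def tags_to_caption(tag_string: str) -> str:
    tags = [t.strip() for t in tag_string.split(',') if t.strip()]
    return ', '.join(t.replace('_', ' ') for t in tags)

cluster_tags = '1girl, solo, looking_at_viewer, blush, jacket, fur_trim, goggles_around_neck'
print(tags_to_caption(cluster_tags))
# -> 1girl, solo, looking at viewer, blush, jacket, fur trim, goggles around neck
```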
# Dataset of ks_23/KS-23/KS-23 (Girls' Frontline)
This is the dataset of ks_23/KS-23/KS-23 (Girls' Frontline), containing 17 images and their tags.
The core tags of this character are `breasts, orange_hair, large_breasts, yellow_eyes, ahoge, long_hair, red_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 16.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 10.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 21.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 14.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 28.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ks_23_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/ks_23_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, fingerless_gloves, sharp_teeth, solo, cleavage, navel, simple_background, blush, midriff, white_background, shorts, black_gloves, elbow_gloves, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | fingerless_gloves | sharp_teeth | solo | cleavage | navel | simple_background | blush | midriff | white_background | shorts | black_gloves | elbow_gloves | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:--------------|:-------|:-----------|:--------|:--------------------|:--------|:----------|:-------------------|:---------|:---------------|:---------------|:--------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of scar_l/SCAR-L (Girls' Frontline)
This is the dataset of scar_l/SCAR-L (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are `bangs, blue_eyes, long_hair, blonde_hair, hat, hair_ornament, hairclip, black_headwear, brown_hair, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 28.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 15.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 47 | 34.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 25.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 47 | 49.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_l_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/scar_l_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, closed_mouth, simple_background, jacket, white_background, white_shirt, blush, holding, scarf, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | closed_mouth | simple_background | jacket | white_background | white_shirt | blush | holding | scarf | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:--------------------|:---------|:-------------------|:--------------|:--------|:----------|:--------|:-------------|
| 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of scar_h/SCAR-H (Girls' Frontline)
This is the dataset of scar_h/SCAR-H (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `bangs, long_hair, blonde_hair, blue_eyes, hat, ponytail, white_headwear, baseball_cap, breasts, brown_hair, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 25.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 13.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 30.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 22.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 43.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scar_h_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/scar_h_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | blue_gloves, 1girl, solo, assault_rifle, black_jacket, feet_out_of_frame, holding_gun, looking_at_viewer, white_background, long_sleeves, midriff, navel, pants, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blue_gloves | 1girl | solo | assault_rifle | black_jacket | feet_out_of_frame | holding_gun | looking_at_viewer | white_background | long_sleeves | midriff | navel | pants | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------|:--------|:-------|:----------------|:---------------|:--------------------|:--------------|:--------------------|:-------------------|:---------------|:----------|:--------|:--------|:--------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of dp28/DP28/DP28 (Girls' Frontline)
This is the dataset of dp28/DP28/DP28 (Girls' Frontline), containing 26 images and their tags.
The core tags of this character are `blonde_hair, long_hair, blue_eyes, breasts, large_breasts, hat, braid, fur_hat, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 41.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 19.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 66 | 45.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 34.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 66 | 72.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/dp28_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/dp28_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, white_gloves, cleavage, solo, belt, blush, thighhighs, looking_at_viewer, black_panties, simple_background, white_background, side-tie_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | white_gloves | cleavage | solo | belt | blush | thighhighs | looking_at_viewer | black_panties | simple_background | white_background | side-tie_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------|:-------|:-------|:--------|:-------------|:--------------------|:----------------|:--------------------|:-------------------|:-------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of pennsylvania/ペンシルベニア/宾夕法尼亚 (Azur Lane)
This is the dataset of pennsylvania/ペンシルベニア/宾夕法尼亚 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `long_hair, green_eyes, brown_hair, breasts, ponytail, large_breasts, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 13.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 7.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 20 | 13.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 11.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 20 | 19.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pennsylvania_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/pennsylvania_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, pantyhose, simple_background, white_background, black_gloves, cleavage, looking_at_viewer, blush, uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | pantyhose | simple_background | white_background | black_gloves | cleavage | looking_at_viewer | blush | uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:--------------------|:-------------------|:---------------|:-----------|:--------------------|:--------|:----------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
# Dataset of kuroshio/黒潮/黑潮 (Azur Lane)
This is the dataset of kuroshio/黒潮/黑潮 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `braid, horns, red_eyes, hair_flower, hair_ornament, long_hair, twin_braids, bangs, pointy_ears, black_hair, bow, red_bow, hair_bow, sidelocks, red_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 10.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 11.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 9.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 16.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuroshio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kuroshio_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, black_scarf, pleated_skirt, red_thighhighs, bare_shoulders, black_skirt, obi, white_background, bridal_gauntlets, elbow_gloves, panties, simple_background, garter_straps, weapon, blush, closed_mouth, floral_print, full_body, kimono, pink_flower, shoes, white_footwear |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | black_scarf | pleated_skirt | red_thighhighs | bare_shoulders | black_skirt | obi | white_background | bridal_gauntlets | elbow_gloves | panties | simple_background | garter_straps | weapon | blush | closed_mouth | floral_print | full_body | kimono | pink_flower | shoes | white_footwear |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------|:----------------|:-----------------|:-----------------|:--------------|:------|:-------------------|:-------------------|:---------------|:----------|:--------------------|:----------------|:---------|:--------|:---------------|:---------------|:------------|:---------|:--------------|:--------|:-----------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset of chicago/シカゴ/芝加哥 (Azur Lane)
This is the dataset of chicago/シカゴ/芝加哥 (Azur Lane), containing 21 images and their tags.
The core tags of this character are `breasts, drill_hair, blonde_hair, ahoge, blue_eyes, large_breasts, twin_drills, hair_between_eyes, long_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 21.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 14.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 29.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 19.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 38.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chicago_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/chicago_azurlane',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, smile, blush, cleavage, bare_shoulders, looking_at_viewer, navel, solo, black_choker, red_gloves, star_print, collarbone, midriff, elbow_gloves, criss-cross_halter, short_shorts, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | smile | blush | cleavage | bare_shoulders | looking_at_viewer | navel | solo | black_choker | red_gloves | star_print | collarbone | midriff | elbow_gloves | criss-cross_halter | short_shorts | sitting |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:-----------|:-----------------|:--------------------|:--------|:-------|:---------------|:-------------|:-------------|:-------------|:----------|:---------------|:---------------------|:---------------|:----------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
# Dataset Card for Evaluation run of Pierre-obi/Mistral_solar-slerp
Dataset automatically created during the evaluation run of model [Pierre-obi/Mistral_solar-slerp](https://huggingface.co/Pierre-obi/Mistral_solar-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp",
    "harness_winogrande_5",
    split="train")
```
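The aggregated "results" configuration mentioned above can be loaded the same way; a minimal sketch, assuming the configuration is exposed under the literal name `results`:
```python
from datasets import load_dataset

# Hedged sketch: load the aggregated results configuration described above
# (the config name "results" is taken from the description, not verified here).
results = load_dataset("open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp",
    "results",
    split="train")
print(results[0])
```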
## Latest results
These are the [latest results from run 2024-01-13T23:33:11.418111](https://huggingface.co/datasets/open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp/blob/main/results_2024-01-13T23-33-11.418111.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.40347501414405273,
"acc_stderr": 0.03383375290012146,
"acc_norm": 0.40822900373379084,
"acc_norm_stderr": 0.03472416283155831,
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394802,
"mc2": 0.46956525596934184,
"mc2_stderr": 0.015501210721813442
},
"harness|arc:challenge|25": {
"acc": 0.4044368600682594,
"acc_stderr": 0.014342036483436174,
"acc_norm": 0.4300341296928328,
"acc_norm_stderr": 0.014467631559137994
},
"harness|hellaswag|10": {
"acc": 0.4433379804819757,
"acc_stderr": 0.004957637648426472,
"acc_norm": 0.5792670782712607,
"acc_norm_stderr": 0.004926678108601339
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3881578947368421,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.3881578947368421,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4226415094339623,
"acc_stderr": 0.030402331445769537,
"acc_norm": 0.4226415094339623,
"acc_norm_stderr": 0.030402331445769537
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.0240268463928735,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.0240268463928735
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2064516129032258,
"acc_stderr": 0.023025899617188726,
"acc_norm": 0.2064516129032258,
"acc_norm_stderr": 0.023025899617188726
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.0356796977226805,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.0356796977226805
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.46464646464646464,
"acc_stderr": 0.03553436368828063,
"acc_norm": 0.46464646464646464,
"acc_norm_stderr": 0.03553436368828063
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6476683937823834,
"acc_stderr": 0.03447478286414357,
"acc_norm": 0.6476683937823834,
"acc_norm_stderr": 0.03447478286414357
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000756,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000756
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236153,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236153
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.43119266055045874,
"acc_stderr": 0.021233365030319563,
"acc_norm": 0.43119266055045874,
"acc_norm_stderr": 0.021233365030319563
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.31862745098039214,
"acc_stderr": 0.0327028718148208,
"acc_norm": 0.31862745098039214,
"acc_norm_stderr": 0.0327028718148208
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.459915611814346,
"acc_stderr": 0.03244246810187914,
"acc_norm": 0.459915611814346,
"acc_norm_stderr": 0.03244246810187914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5381165919282511,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.5381165919282511,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.04384140024078016,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.04384140024078016
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536823,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536823
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4233128834355828,
"acc_stderr": 0.038818912133343826,
"acc_norm": 0.4233128834355828,
"acc_norm_stderr": 0.038818912133343826
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674064,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674064
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.51213282247765,
"acc_stderr": 0.017874698667491338,
"acc_norm": 0.51213282247765,
"acc_norm_stderr": 0.017874698667491338
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.026680134761679214,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.026680134761679214
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331146,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331146
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4919614147909968,
"acc_stderr": 0.028394421370984545,
"acc_norm": 0.4919614147909968,
"acc_norm_stderr": 0.028394421370984545
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.39197530864197533,
"acc_stderr": 0.027163686038271233,
"acc_norm": 0.39197530864197533,
"acc_norm_stderr": 0.027163686038271233
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3120567375886525,
"acc_stderr": 0.02764012054516993,
"acc_norm": 0.3120567375886525,
"acc_norm_stderr": 0.02764012054516993
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2966101694915254,
"acc_stderr": 0.011665946586082854,
"acc_norm": 0.2966101694915254,
"acc_norm_stderr": 0.011665946586082854
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541104,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541104
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3839869281045752,
"acc_stderr": 0.01967580813528152,
"acc_norm": 0.3839869281045752,
"acc_norm_stderr": 0.01967580813528152
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3795918367346939,
"acc_stderr": 0.031067211262872495,
"acc_norm": 0.3795918367346939,
"acc_norm_stderr": 0.031067211262872495
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.30845771144278605,
"acc_stderr": 0.03265819588512699,
"acc_norm": 0.30845771144278605,
"acc_norm_stderr": 0.03265819588512699
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.038284011150790206,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.038284011150790206
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49707602339181284,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.49707602339181284,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2876376988984088,
"mc1_stderr": 0.015846315101394802,
"mc2": 0.46956525596934184,
"mc2_stderr": 0.015501210721813442
},
"harness|winogrande|5": {
"acc": 0.6819258089976322,
"acc_stderr": 0.013089285079884678
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.0021386703014604777
}
}
```
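If you only need a few headline numbers, a minimal sketch that downloads the raw results file linked above (the filename is taken from that link; the JSON layout can vary slightly between harness versions) looks like this:
```python
import json

from huggingface_hub import hf_hub_download

# Filename taken from the "latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp",
    repo_type="dataset",
    filename="results_2024-01-13T23-33-11.418111.json",
)
with open(path) as f:
    run = json.load(f)

# The metrics shown above may sit at the top level or under a "results" key,
# depending on the harness version, so handle both.
metrics = run.get("results", run)
print(metrics["all"]["acc"])  # overall accuracy for this run (~0.403)
```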
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp | [
"region:us"
] | 2024-01-13T23:35:29+00:00 | {"pretty_name": "Evaluation run of Pierre-obi/Mistral_solar-slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Pierre-obi/Mistral_solar-slerp](https://huggingface.co/Pierre-obi/Mistral_solar-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T23:33:11.418111](https://huggingface.co/datasets/open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp/blob/main/results_2024-01-13T23-33-11.418111.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.40347501414405273,\n \"acc_stderr\": 0.03383375290012146,\n \"acc_norm\": 0.40822900373379084,\n \"acc_norm_stderr\": 0.03472416283155831,\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.015846315101394802,\n \"mc2\": 0.46956525596934184,\n \"mc2_stderr\": 0.015501210721813442\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4044368600682594,\n \"acc_stderr\": 0.014342036483436174,\n \"acc_norm\": 0.4300341296928328,\n \"acc_norm_stderr\": 0.014467631559137994\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4433379804819757,\n \"acc_stderr\": 0.004957637648426472,\n \"acc_norm\": 0.5792670782712607,\n \"acc_norm_stderr\": 0.004926678108601339\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3881578947368421,\n \"acc_stderr\": 0.03965842097512744,\n \"acc_norm\": 0.3881578947368421,\n \"acc_norm_stderr\": 0.03965842097512744\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4226415094339623,\n \"acc_stderr\": 0.030402331445769537,\n \"acc_norm\": 0.4226415094339623,\n \"acc_norm_stderr\": 0.030402331445769537\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3201058201058201,\n \"acc_stderr\": 0.0240268463928735,\n \"acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.0240268463928735\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2064516129032258,\n \"acc_stderr\": 0.023025899617188726,\n \"acc_norm\": 0.2064516129032258,\n \"acc_norm_stderr\": 0.023025899617188726\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3448275862068966,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.3448275862068966,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.0356796977226805,\n \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.0356796977226805\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.46464646464646464,\n \"acc_stderr\": 0.03553436368828063,\n \"acc_norm\": 0.46464646464646464,\n \"acc_norm_stderr\": 0.03553436368828063\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6476683937823834,\n \"acc_stderr\": 0.03447478286414357,\n \"acc_norm\": 0.6476683937823834,\n \"acc_norm_stderr\": 0.03447478286414357\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000756,\n \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000756\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236153,\n \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236153\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.43119266055045874,\n \"acc_stderr\": 0.021233365030319563,\n \"acc_norm\": 0.43119266055045874,\n \"acc_norm_stderr\": 0.021233365030319563\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.31862745098039214,\n \"acc_stderr\": 0.0327028718148208,\n \"acc_norm\": 0.31862745098039214,\n \"acc_norm_stderr\": 0.0327028718148208\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.459915611814346,\n \"acc_stderr\": 0.03244246810187914,\n \"acc_norm\": 0.459915611814346,\n \"acc_norm_stderr\": 0.03244246810187914\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5381165919282511,\n \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.5381165919282511,\n \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.04384140024078016,\n \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.04384140024078016\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.04812917324536823,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.04812917324536823\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4233128834355828,\n \"acc_stderr\": 0.038818912133343826,\n \"acc_norm\": 0.4233128834355828,\n \"acc_norm_stderr\": 0.038818912133343826\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n \"acc_stderr\": 0.029745048572674064,\n \"acc_norm\": 0.7094017094017094,\n \"acc_norm_stderr\": 0.029745048572674064\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.51213282247765,\n 
\"acc_stderr\": 0.017874698667491338,\n \"acc_norm\": 0.51213282247765,\n \"acc_norm_stderr\": 0.017874698667491338\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5664739884393064,\n \"acc_stderr\": 0.026680134761679214,\n \"acc_norm\": 0.5664739884393064,\n \"acc_norm_stderr\": 0.026680134761679214\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331146,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331146\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4150326797385621,\n \"acc_stderr\": 0.028213504177824093,\n \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.028213504177824093\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4919614147909968,\n \"acc_stderr\": 0.028394421370984545,\n \"acc_norm\": 0.4919614147909968,\n \"acc_norm_stderr\": 0.028394421370984545\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.39197530864197533,\n \"acc_stderr\": 0.027163686038271233,\n \"acc_norm\": 0.39197530864197533,\n \"acc_norm_stderr\": 0.027163686038271233\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3120567375886525,\n \"acc_stderr\": 0.02764012054516993,\n \"acc_norm\": 0.3120567375886525,\n \"acc_norm_stderr\": 0.02764012054516993\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2966101694915254,\n \"acc_stderr\": 0.011665946586082854,\n \"acc_norm\": 0.2966101694915254,\n \"acc_norm_stderr\": 0.011665946586082854\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541104,\n \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541104\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.3839869281045752,\n \"acc_stderr\": 0.01967580813528152,\n \"acc_norm\": 0.3839869281045752,\n \"acc_norm_stderr\": 0.01967580813528152\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3795918367346939,\n \"acc_stderr\": 0.031067211262872495,\n \"acc_norm\": 0.3795918367346939,\n \"acc_norm_stderr\": 0.031067211262872495\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.30845771144278605,\n \"acc_stderr\": 0.03265819588512699,\n \"acc_norm\": 0.30845771144278605,\n \"acc_norm_stderr\": 0.03265819588512699\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.038284011150790206,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.038284011150790206\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.49707602339181284,\n \"acc_stderr\": 0.03834759370936839,\n \"acc_norm\": 0.49707602339181284,\n \"acc_norm_stderr\": 0.03834759370936839\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2876376988984088,\n \"mc1_stderr\": 0.015846315101394802,\n \"mc2\": 0.46956525596934184,\n \"mc2_stderr\": 0.015501210721813442\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6819258089976322,\n \"acc_stderr\": 0.013089285079884678\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \"acc_stderr\": 0.0021386703014604777\n 
}\n}\n```", "repo_url": "https://huggingface.co/Pierre-obi/Mistral_solar-slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-33-11.418111.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-33-11.418111.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-33-11.418111.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-33-11.418111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-33-11.418111.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T23_33_11.418111", "path": ["**/details_harness|winogrande|5_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T23-33-11.418111.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_13T23_33_11.418111", "path": ["results_2024-01-13T23-33-11.418111.parquet"]}, {"split": "latest", "path": ["results_2024-01-13T23-33-11.418111.parquet"]}]}]} | 2024-01-13T23:35:50+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Pierre-obi/Mistral_solar-slerp
Dataset automatically created during the evaluation run of model Pierre-obi/Mistral_solar-slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
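For instance, the minimal sketch below uses the `datasets` library; the repository name is an assumption based on the leaderboard's usual `details_<org>__<model>` naming, while the `harness_winogrande_5` and `results` config names and the `latest` split are taken from the configuration list above.
```python
from datasets import load_dataset

# Assumed repository name (the leaderboard's usual "details_<org>__<model>" scheme).
repo = "open-llm-leaderboard/details_Pierre-obi__Mistral_solar-slerp"

# Load one task configuration; the "latest" split always points to the most recent run.
winogrande = load_dataset(repo, "harness_winogrande_5", split="latest")

# Load the aggregated "results" configuration for the same run.
results = load_dataset(repo, "results", split="latest")

print(winogrande)
print(results)
```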
## Latest results
These are the latest results from run 2024-01-13T23:33:11.418111 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Pierre-obi/Mistral_solar-slerp\n\n\n\nDataset automatically created during the evaluation run of model Pierre-obi/Mistral_solar-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T23:33:11.418111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Pierre-obi/Mistral_solar-slerp\n\n\n\nDataset automatically created during the evaluation run of model Pierre-obi/Mistral_solar-slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T23:33:11.418111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
756a8e32ec189e4df67038c92aef537b00d428f3 |
A further augmented and modified version of [Augmental-Dataset](https://huggingface.co/datasets/Heralax/Augmental-Dataset) for Steins;Gate-themed RP in Fastchat format, modified in the following ways:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content), as well as the scenario setup and character introductions.
- All split conversations were joined.
- The assistant always plays only a single character (chosen to be the character with the maximum number of lines who is not the first speaker). All other characters are assigned to the user. This is described precisely in the first prompt.
- Conversations alternate between user and assistant, with the first prompt always coming from the user and the last always from the assistant (a structural sketch of one such record is shown below).
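A minimal structural sketch of one record under these constraints follows; the exact field names (`conversations`, `from`, `value`, `human`/`gpt`) follow common Fastchat/ShareGPT conventions and are assumptions here, not something taken from the dataset itself.
```python
# Hypothetical record illustrating the constraints above; field names are assumed.
record = {
    "conversations": [
        {"from": "human", "value": "[scenario setup, character introductions, OOC/emoji notes] <user characters' lines>"},
        {"from": "gpt", "value": "<assistant character's line>"},
        {"from": "human", "value": "<user characters' lines>"},
        {"from": "gpt", "value": "<assistant character's line>"},
    ]
}

# Sanity checks for the invariants described in the list above.
turns = record["conversations"]
assert turns[0]["from"] == "human"  # the first prompt is always from the user
assert turns[-1]["from"] == "gpt"   # the last message is always from the assistant
assert all(a["from"] != b["from"] for a, b in zip(turns, turns[1:]))  # strict alternation
```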
| grimulkan/Augmental-Stenisgate-Augmented | [
"license:unknown",
"region:us"
] | 2024-01-13T23:37:29+00:00 | {"license": "unknown"} | 2024-01-13T23:45:27+00:00 | [] | [] | TAGS
#license-unknown #region-us
|
A further augmented and modified version of Augmental-Dataset for Steins;Gate-themed RP in Fastchat format, modified in the following ways:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content), as well as the scenario setup and character introductions.
- All split conversations were joined.
- The assistant always plays only a single character (chosen to be the character with the maximum number of lines who is not the first speaker). All other characters are assigned to the user. This is described precisely in the first prompt.
- Conversations alternate between user and assistant, with the first prompt always being from the user, and the last always being from the assistant.
| [] | [
"TAGS\n#license-unknown #region-us \n"
] |
643aefcdb871e216eea3bc827ac7b6e00e4d2f79 |
# Dataset of scw/SCW/SCW (Girls' Frontline)
This is the dataset of scw/SCW/SCW (Girls' Frontline), containing 14 images and their tags.
The core tags of this character are `blonde_hair, short_hair, red_eyes, headphones, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 16.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 11.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 20.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 16.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 25.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/scw_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/scw_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
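The packaged IMG+TXT variants can also be used without waifuc. The sketch below assumes the zip contains image files paired with same-named `.txt` sidecar files holding the tags (which is what the IMG+TXT type denotes here); treat the layout as an assumption rather than a guaranteed structure.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw one
zip_file = hf_hub_download(
    repo_id='CyberHarem/scw_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumed layout: each image is paired with a same-named .txt file of comma-separated tags
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        txt_path = os.path.join(dataset_dir, stem + '.txt')
        if os.path.exists(txt_path):
            with open(txt_path, 'r', encoding='utf-8') as f:
                tags = f.read().strip()
            print(name, '->', tags)
```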
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, jacket, gloves, looking_at_viewer, smile, assault_rifle, armband, boots, holding_gun, single_thighhigh, socks, uneven_legwear, bag, eagle, full_body, headset, red_scarf |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | jacket | gloves | looking_at_viewer | smile | assault_rifle | armband | boots | holding_gun | single_thighhigh | socks | uneven_legwear | bag | eagle | full_body | headset | red_scarf |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:---------|:--------------------|:--------|:----------------|:----------|:--------|:--------------|:-------------------|:--------|:-----------------|:------|:--------|:------------|:----------|:------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/scw_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:46:19+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of scw/SCW/SCW (Girls' Frontline)
=========================================
This is the dataset of scw/SCW/SCW (Girls' Frontline), containing 14 images and their tags.
The core tags of this character are 'blonde\_hair, short\_hair, red\_eyes, headphones, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
c70a746a039c4cdf15a9f7b8bdc1ce3ce295d7e8 |
# Dataset of m9/M9/M9 (Girls' Frontline)
This is the dataset of m9/M9/M9 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `blonde_hair, long_hair, red_eyes, hairband, fang, very_long_hair, breasts, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 10.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 15.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 10.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 19.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m9_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m9_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, looking_at_viewer, open_mouth, detached_sleeves, handgun, bare_shoulders, blush, red_dress, black_pantyhose |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | open_mouth | detached_sleeves | handgun | bare_shoulders | blush | red_dress | black_pantyhose |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-------------|:-------------------|:----------|:-----------------|:--------|:------------|:------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m9_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:46:08+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of m9/M9/M9 (Girls' Frontline)
======================================
This is the dataset of m9/M9/M9 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, red\_eyes, hairband, fang, very\_long\_hair, breasts, hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7326c3933956173c6c0c150d218665a77b33d9a4 |
# Dataset of m500/M500/M500 (Girls' Frontline)
This is the dataset of m500/M500/M500 (Girls' Frontline), containing 30 images and their tags.
The core tags of this character are `animal_ears, blonde_hair, long_hair, blue_eyes, breasts, goggles_on_head, large_breasts, tail, bangs, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 29.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 19.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 36.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 26.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 48.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m500_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m500_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------|
| 0 | 30 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, open_mouth, shirt, cleavage, holding, goggles, shorts, blush, gloves, looking_at_viewer |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | open_mouth | shirt | cleavage | holding | goggles | shorts | blush | gloves | looking_at_viewer |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-------------|:--------|:-----------|:----------|:----------|:---------|:--------|:---------|:--------------------|
| 0 | 30 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m500_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:30+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:49:08+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of m500/M500/M500 (Girls' Frontline)
============================================
This is the dataset of m500/M500/M500 (Girls' Frontline), containing 30 images and their tags.
The core tags of this character are 'animal\_ears, blonde\_hair, long\_hair, blue\_eyes, breasts, goggles\_on\_head, large\_breasts, tail, bangs, fang', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
95707712060be389f0decc608b14b900517cc8a5 |
# Dataset of mg3/MG3/MG3 (Girls' Frontline)
This is the dataset of mg3/MG3/MG3 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, breasts, large_breasts, long_hair, braid, single_braid, bangs, hair_between_eyes, hair_ornament, hairclip, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 19.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 11.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 20.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 17.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 28.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mg3_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mg3_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, blush, black_pantyhose, sweater, boots, cleavage, full_body, gun, necklace, off_shoulder |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | black_pantyhose | sweater | boots | cleavage | full_body | gun | necklace | off_shoulder |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:------------------|:----------|:--------|:-----------|:------------|:------|:-----------|:---------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/mg3_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:35+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:47:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of mg3/MG3/MG3 (Girls' Frontline)
=========================================
This is the dataset of mg3/MG3/MG3 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are 'blonde\_hair, blue\_eyes, breasts, large\_breasts, long\_hair, braid, single\_braid, bangs, hair\_between\_eyes, hair\_ornament, hairclip, hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a48227d0965e7b85af25500f23122959c70ed0bc |
# Dataset of galil/ガリル/加利尔 (Girls' Frontline)
This is the dataset of galil/ガリル/加利尔 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `long_hair, ahoge, brown_hair, brown_eyes, blonde_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 9.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 12.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 8.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 15.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galil_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/galil_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, simple_background, skirt, white_background, assault_rifle, holding_weapon, jacket, military_uniform, necklace, pantyhose, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | simple_background | skirt | white_background | assault_rifle | holding_weapon | jacket | military_uniform | necklace | pantyhose | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------------------|:--------|:-------------------|:----------------|:-----------------|:---------|:-------------------|:-----------|:------------|:--------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/galil_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:43:45+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:47:04+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of galil/ガリル/加利尔 (Girls' Frontline)
===========================================
This is the dataset of galil/ガリル/加利尔 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are 'long\_hair, ahoge, brown\_hair, brown\_eyes, blonde\_hair, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
9496bbcd0832d60c1882ba7d03ab8772f15e85dd |
# Dataset of leonardo_da_vinci/レオナルド・ダ・ヴィンチ/莱昂纳多·达·芬奇 (Azur Lane)
This is the dataset of leonardo_da_vinci/レオナルド・ダ・ヴィンチ/莱昂纳多·达·芬奇 (Azur Lane), containing 56 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, breasts, bangs, goggles_on_head, small_breasts, orange_eyes, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 56 | 77.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 56 | 43.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 139 | 90.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 56 | 67.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 139 | 127.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/leonardo_da_vinci_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/leonardo_da_vinci_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 32 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, goggles, looking_at_viewer, smile, bare_shoulders, navel, blush, thighhighs, simple_background, off_shoulder, zipper_pull_tab, white_background, white_coat, open_coat, open_mouth, thigh_strap, thighs, highleg_swimsuit, long_sleeves |
| 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, earrings, hair_flower, looking_at_viewer, navel, tiara, wings, bare_shoulders, medium_breasts, smile, ballerina, collarbone, red_dress, tutu, ballet_slippers, white_pantyhose, closed_mouth, full_body, red_footwear, red_rose, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | goggles | looking_at_viewer | smile | bare_shoulders | navel | blush | thighhighs | simple_background | off_shoulder | zipper_pull_tab | white_background | white_coat | open_coat | open_mouth | thigh_strap | thighs | highleg_swimsuit | long_sleeves | earrings | hair_flower | tiara | wings | medium_breasts | ballerina | collarbone | red_dress | tutu | ballet_slippers | white_pantyhose | closed_mouth | full_body | red_footwear | red_rose | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:--------------------|:--------|:-----------------|:--------|:--------|:-------------|:--------------------|:---------------|:------------------|:-------------------|:-------------|:------------|:-------------|:--------------|:---------|:-------------------|:---------------|:-----------|:--------------|:--------|:--------|:-----------------|:------------|:-------------|:------------|:-------|:------------------|:------------------|:---------------|:------------|:---------------|:-----------|:-----------|
| 0 | 32 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/leonardo_da_vinci_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:47:36+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:59:57+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of leonardo\_da\_vinci/レオナルド・ダ・ヴィンチ/莱昂纳多·达·芬奇 (Azur Lane)
=================================================================
This is the dataset of leonardo\_da\_vinci/レオナルド・ダ・ヴィンチ/莱昂纳多·达·芬奇 (Azur Lane), containing 56 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, twintails, breasts, bangs, goggles\_on\_head, small\_breasts, orange\_eyes, red\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a172360f32cb20c1086a1fe4e46681c70e128f40 |
# Dataset of attilio_regolo/アッティリオ・レゴロ/阿蒂利奥·雷戈洛 (Azur Lane)
This is the dataset of attilio_regolo/アッティリオ・レゴロ/阿蒂利奥·雷戈洛 (Azur Lane), containing 29 images and their tags.
The core tags of this character are `long_hair, purple_eyes, bangs, ahoge, twintails, blonde_hair, very_long_hair, hair_between_eyes, bow, ribbon, hair_ornament, breasts, fang, symbol-shaped_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 29 | 46.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 29 | 22.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 65 | 47.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 29 | 38.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 65 | 75.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/attilio_regolo_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/attilio_regolo_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, bare_shoulders, blush, looking_at_viewer, open_mouth, long_sleeves, dress, heart, collarbone, underwear, detached_sleeves, :d, sitting, halterneck |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | blush | looking_at_viewer | open_mouth | long_sleeves | dress | heart | collarbone | underwear | detached_sleeves | :d | sitting | halterneck |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------|:--------------------|:-------------|:---------------|:--------|:--------|:-------------|:------------|:-------------------|:-----|:----------|:-------------|
| 0 | 29 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/attilio_regolo_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:47:37+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:54:06+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of attilio\_regolo/アッティリオ・レゴロ/阿蒂利奥·雷戈洛 (Azur Lane)
==========================================================
This is the dataset of attilio\_regolo/アッティリオ・レゴロ/阿蒂利奥·雷戈洛 (Azur Lane), containing 29 images and their tags.
The core tags of this character are 'long\_hair, purple\_eyes, bangs, ahoge, twintails, blonde\_hair, very\_long\_hair, hair\_between\_eyes, bow, ribbon, hair\_ornament, breasts, fang, symbol-shaped\_pupils', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7f42c08dba1c7d814efc36356d14e41d500e3c9f |
# Dataset of kagero/陽炎/阳炎 (Azur Lane)
This is the dataset of kagero/陽炎/阳炎 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `animal_ears, brown_hair, purple_eyes, twintails, bangs, fang, fox_ears, rabbit_ears, short_hair, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 9.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 7.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 21 | 12.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 9.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 21 | 14.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kagero_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kagero_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
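If you only need a quick sanity check of what was extracted, the same loop can be turned into a small tag tally. This is a minimal sketch that relies only on the `item.meta['filename']` and `item.meta['tags']` fields already used above; the defensive handling of `tags` (mapping vs. plain list) is an assumption, since the exact structure of the meta field is not documented here.
```python
from collections import Counter

from waifuc.source import LocalSource

# reuse the directory extracted in the snippet above
source = LocalSource('dataset_dir')

tag_counter = Counter()
n_items = 0
for item in source:
    n_items += 1
    tags = item.meta.get('tags') or {}
    # 'tags' may be a mapping of tag -> score or a plain list; handle both (assumption)
    tag_counter.update(tags.keys() if hasattr(tags, 'keys') else tags)

print(f'{n_items} images loaded')
print(tag_counter.most_common(10))
```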
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | looking_at_viewer, 1girl, solo, bare_shoulders, blush, detached_sleeves, open_mouth, wide_sleeves, collarbone, simple_background, :d, full_body, long_sleeves, machinery, turret, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | solo | bare_shoulders | blush | detached_sleeves | open_mouth | wide_sleeves | collarbone | simple_background | :d | full_body | long_sleeves | machinery | turret | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------|:-----------------|:--------|:-------------------|:-------------|:---------------|:-------------|:--------------------|:-----|:------------|:---------------|:------------|:---------|:-------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kagero_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:47:38+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:51:08+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kagero/陽炎/阳炎 (Azur Lane)
===================================
This is the dataset of kagero/陽炎/阳炎 (Azur Lane), containing 13 images and their tags.
The core tags of this character are 'animal\_ears, brown\_hair, purple\_eyes, twintails, bangs, fang, fox\_ears, rabbit\_ears, short\_hair, ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
aed85d45e538761d404495924484b01f22373b70 |
# Dataset of flandre/フランドル/弗兰德尔 (Azur Lane)
This is the dataset of flandre/フランドル/弗兰德尔 (Azur Lane), containing 42 images and their tags.
The core tags of this character are `long_hair, bangs, white_hair, twintails, purple_eyes, breasts, hat, small_breasts, bow, ribbon, grey_eyes, low_twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 83.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 42 | 37.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 110 | 87.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 42 | 68.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 110 | 139.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/flandre_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/flandre_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, black_thighhighs, garter_straps, long_sleeves, looking_at_viewer, solo, white_leotard, blush, grey_hair, thighs, closed_mouth, hair_ornament, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_thighhighs | garter_straps | long_sleeves | looking_at_viewer | solo | white_leotard | blush | grey_hair | thighs | closed_mouth | hair_ornament | smile | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------------|:----------------|:---------------|:--------------------|:-------|:----------------|:--------|:------------|:---------|:---------------|:----------------|:--------|:-------------------|
| 0 | 6 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/flandre_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:47:40+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T00:00:57+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of flandre/フランドル/弗兰德尔 (Azur Lane)
=========================================
This is the dataset of flandre/フランドル/弗兰德尔 (Azur Lane), containing 42 images and their tags.
The core tags of this character are 'long\_hair, bangs, white\_hair, twintails, purple\_eyes, breasts, hat, small\_breasts, bow, ribbon, grey\_eyes, low\_twintails', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2e1ac4c00e6eec9e66f727725de5c35b8c057cec |
# Dataset of michishio/満潮/满潮 (Azur Lane)
This is the dataset of michishio/満潮/满潮 (Azur Lane), containing 23 images and their tags.
The core tags of this character are `animal_ears, cat_ears, bangs, animal_ear_fluff, breasts, brown_hair, long_hair, ahoge, brown_eyes, braid, cat_girl, large_breasts, hair_between_eyes, cat_tail, medium_breasts, ribbon, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 24.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 17.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 51 | 34.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 23.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 51 | 44.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/michishio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/michishio_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, balloon, detached_sleeves, looking_at_viewer, open_mouth, solo, :d, pink_dress, puffy_short_sleeves, frills, full_body, high_heels, pink_footwear, white_background, white_thighhighs, bare_shoulders, blush, bow, cleavage_cutout, hair_rings, jingle_bell, petals, simple_background, standing_on_one_leg, tiara, very_long_hair, virtual_youtuber |
| 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | blush, jingle_bell, :d, neck_bell, open_mouth, kimono, long_sleeves, looking_at_viewer, red_skirt, 2girls, bare_shoulders, wide_sleeves, off_shoulder, pleated_skirt, white_shirt, sailor_collar, simple_background, white_background, holding, red_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | balloon | detached_sleeves | looking_at_viewer | open_mouth | solo | :d | pink_dress | puffy_short_sleeves | frills | full_body | high_heels | pink_footwear | white_background | white_thighhighs | bare_shoulders | blush | bow | cleavage_cutout | hair_rings | jingle_bell | petals | simple_background | standing_on_one_leg | tiara | very_long_hair | virtual_youtuber | neck_bell | kimono | long_sleeves | red_skirt | 2girls | wide_sleeves | off_shoulder | pleated_skirt | white_shirt | sailor_collar | holding | red_ribbon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:-------------------|:--------------------|:-------------|:-------|:-----|:-------------|:----------------------|:---------|:------------|:-------------|:----------------|:-------------------|:-------------------|:-----------------|:--------|:------|:------------------|:-------------|:--------------|:---------|:--------------------|:----------------------|:--------|:-----------------|:-------------------|:------------|:---------|:---------------|:------------|:---------|:---------------|:---------------|:----------------|:--------------|:----------------|:----------|:-------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 15 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | | | | X | X | | X | | | | | | | X | | X | X | | | | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/michishio_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-13T23:48:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-13T23:54:25+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of michishio/満潮/满潮 (Azur Lane)
======================================
This is the dataset of michishio/満潮/满潮 (Azur Lane), containing 23 images and their tags.
The core tags of this character are 'animal\_ears, cat\_ears, bangs, animal\_ear\_fluff, breasts, brown\_hair, long\_hair, ahoge, brown\_eyes, braid, cat\_girl, large\_breasts, hair\_between\_eyes, cat\_tail, medium\_breasts, ribbon, tail', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
8bf59d39e039531315f674d8fd5d245a13014d59 |
# Dataset Card for Evaluation run of Kquant03/Ryu-4x7B-MoE-bf16
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kquant03/Ryu-4x7B-MoE-bf16](https://huggingface.co/Kquant03/Ryu-4x7B-MoE-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16",
"harness_winogrande_5",
split="train")
```
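If you are unsure which configuration names exist, the `datasets` library can list them directly. This is a minimal sketch; the config name `harness_gsm8k_5` and the `latest` split are taken from this dataset's configuration metadata and are used here purely for illustration.
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16"

# List every available configuration (one per evaluated task, plus aggregated ones).
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Load one task's details; the "latest" split points at the most recent run,
# while timestamped splits keep the results of each individual run.
details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(details)
```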
## Latest results
These are the [latest results from run 2024-01-13T23:51:35.789085](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16/blob/main/results_2024-01-13T23-51-35.789085.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6396004158808674,
"acc_stderr": 0.032332778374865194,
"acc_norm": 0.6426234407115231,
"acc_norm_stderr": 0.03298221583193354,
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.649568492897451,
"mc2_stderr": 0.015609242157624164
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620196,
"acc_norm": 0.6646757679180887,
"acc_norm_stderr": 0.01379618294778556
},
"harness|hellaswag|10": {
"acc": 0.6634136626170085,
"acc_stderr": 0.004715762925037027,
"acc_norm": 0.831009759012149,
"acc_norm_stderr": 0.0037397742854185247
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124484,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124484
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545843,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545843
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524006,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524006
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.026311858071854155,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.026311858071854155
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730581,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730581
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.019162418588623557,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.019162418588623557
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49938800489596086,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.649568492897451,
"mc2_stderr": 0.015609242157624164
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386798
},
"harness|gsm8k|5": {
"acc": 0.4973464746019712,
"acc_stderr": 0.01377229076885817
}
}
```
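To inspect these aggregated numbers outside of the leaderboard UI, one option is to download the results file linked above and read it as plain JSON. This is a minimal sketch; only the repository id and file name come from this card, and the lookup is written defensively because the exact nesting inside the file is not shown here.
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file referenced in the "Latest results" section.
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16",
    repo_type="dataset",
    filename="results_2024-01-13T23-51-35.789085.json",
)

with open(results_path, "r", encoding="utf-8") as f:
    results = json.load(f)

# The per-task blocks printed above may sit at the top level or under a
# "results" key depending on the file layout, so look them up defensively.
summary = results.get("results", results).get("all", {})
print(summary)
```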
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16 | [
"region:us"
] | 2024-01-13T23:53:53+00:00 | {"pretty_name": "Evaluation run of Kquant03/Ryu-4x7B-MoE-bf16", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kquant03/Ryu-4x7B-MoE-bf16](https://huggingface.co/Kquant03/Ryu-4x7B-MoE-bf16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-13T23:51:35.789085](https://huggingface.co/datasets/open-llm-leaderboard/details_Kquant03__Ryu-4x7B-MoE-bf16/blob/main/results_2024-01-13T23-51-35.789085.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6396004158808674,\n \"acc_stderr\": 0.032332778374865194,\n \"acc_norm\": 0.6426234407115231,\n \"acc_norm_stderr\": 0.03298221583193354,\n \"mc1\": 0.49938800489596086,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.649568492897451,\n \"mc2_stderr\": 0.015609242157624164\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620196,\n \"acc_norm\": 0.6646757679180887,\n \"acc_norm_stderr\": 0.01379618294778556\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6634136626170085,\n \"acc_stderr\": 0.004715762925037027,\n \"acc_norm\": 0.831009759012149,\n \"acc_norm_stderr\": 0.0037397742854185247\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124484,\n \"acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124484\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n \"acc_stderr\": 0.030500283176545843,\n \"acc_norm\": 0.7085201793721974,\n \"acc_norm_stderr\": 0.030500283176545843\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n \"acc_stderr\": 
0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n \"acc_stderr\": 0.016653875777524006,\n \"acc_norm\": 0.4547486033519553,\n \"acc_norm_stderr\": 0.016653875777524006\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.026311858071854155,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.026311858071854155\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730581,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730581\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.019162418588623557,\n \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.019162418588623557\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49938800489596086,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.649568492897451,\n \"mc2_stderr\": 0.015609242157624164\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386798\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4973464746019712,\n \"acc_stderr\": 0.01377229076885817\n }\n}\n```", "repo_url": 
"https://huggingface.co/Kquant03/Ryu-4x7B-MoE-bf16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-51-35.789085.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-51-35.789085.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-51-35.789085.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-13T23-51-35.789085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-51-35.789085.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T23-51-35.789085.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["**/details_harness|winogrande|5_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-13T23-51-35.789085.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_13T23_51_35.789085", "path": ["results_2024-01-13T23-51-35.789085.parquet"]}, {"split": "latest", "path": 
["results_2024-01-13T23-51-35.789085.parquet"]}]}]} | 2024-01-13T23:54:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Kquant03/Ryu-4x7B-MoE-bf16
Dataset automatically created during the evaluation run of model Kquant03/Ryu-4x7B-MoE-bf16 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-13T23:51:35.789085 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Kquant03/Ryu-4x7B-MoE-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Ryu-4x7B-MoE-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T23:51:35.789085(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kquant03/Ryu-4x7B-MoE-bf16\n\n\n\nDataset automatically created during the evaluation run of model Kquant03/Ryu-4x7B-MoE-bf16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-13T23:51:35.789085(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
acfebe75504019b17d16c1152255faa14c1c5ecb |
# Dataset Card for Evaluation run of jefferylovely/AiMaven-Orca2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jefferylovely/AiMaven-Orca2](https://huggingface.co/jefferylovely/AiMaven-Orca2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2",
"harness_winogrande_5",
split="train")
```
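The aggregated scores mentioned above live in the separate "results" configuration. A minimal sketch for loading them is shown below; it assumes (as with the per-task configurations) that a "latest" split points at the most recent run.
```python
from datasets import load_dataset

# Minimal sketch (not part of the original card): load the aggregated metrics.
# Assumes the "results" configuration exposes a "latest" split, mirroring the
# per-task configurations of this repository.
results = load_dataset(
    "open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores of the latest run
```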
## Latest results
These are the [latest results from run 2024-01-14T00:32:07.397103](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2/blob/main/results_2024-01-14T00-32-07.397103.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.54465465733523,
"acc_stderr": 0.034129622181532,
"acc_norm": 0.5502915050766849,
"acc_norm_stderr": 0.03486859931303051,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5343298654242948,
"mc2_stderr": 0.01618337374565952
},
"harness|arc:challenge|25": {
"acc": 0.5187713310580204,
"acc_stderr": 0.014601090150633964,
"acc_norm": 0.5469283276450512,
"acc_norm_stderr": 0.014546892052005628
},
"harness|hellaswag|10": {
"acc": 0.6054570802628958,
"acc_stderr": 0.004877534215987091,
"acc_norm": 0.789982075283808,
"acc_norm_stderr": 0.0040648854960034396
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.039255233810529325,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.039255233810529325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955784,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955784
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5763888888888888,
"acc_stderr": 0.0413212501972337,
"acc_norm": 0.5763888888888888,
"acc_norm_stderr": 0.0413212501972337
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699947,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699947
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6387096774193548,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.6387096774193548,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6616161616161617,
"acc_stderr": 0.03371124142626303,
"acc_norm": 0.6616161616161617,
"acc_norm_stderr": 0.03371124142626303
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7616580310880829,
"acc_stderr": 0.030748905363909895,
"acc_norm": 0.7616580310880829,
"acc_norm_stderr": 0.030748905363909895
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5384615384615384,
"acc_stderr": 0.025275892070240637,
"acc_norm": 0.5384615384615384,
"acc_norm_stderr": 0.025275892070240637
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5630252100840336,
"acc_stderr": 0.032219436365661956,
"acc_norm": 0.5630252100840336,
"acc_norm_stderr": 0.032219436365661956
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.726605504587156,
"acc_stderr": 0.01910929984609829,
"acc_norm": 0.726605504587156,
"acc_norm_stderr": 0.01910929984609829
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693268,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693268
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196694,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196694
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7381864623243933,
"acc_stderr": 0.015720838678445266,
"acc_norm": 0.7381864623243933,
"acc_norm_stderr": 0.015720838678445266
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574904,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574904
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6109324758842444,
"acc_stderr": 0.027690337536485372,
"acc_norm": 0.6109324758842444,
"acc_norm_stderr": 0.027690337536485372
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138015,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138015
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.02923346574557309,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.02923346574557309
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38070404172099087,
"acc_stderr": 0.012401430654645893,
"acc_norm": 0.38070404172099087,
"acc_norm_stderr": 0.012401430654645893
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5073529411764706,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.5073529411764706,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5343137254901961,
"acc_stderr": 0.020180144843307293,
"acc_norm": 0.5343137254901961,
"acc_norm_stderr": 0.020180144843307293
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6571428571428571,
"acc_stderr": 0.030387262919547728,
"acc_norm": 0.6571428571428571,
"acc_norm_stderr": 0.030387262919547728
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5472636815920398,
"acc_stderr": 0.03519702717576915,
"acc_norm": 0.5472636815920398,
"acc_norm_stderr": 0.03519702717576915
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.038913644958358196,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.038913644958358196
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209204,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209204
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088365,
"mc2": 0.5343298654242948,
"mc2_stderr": 0.01618337374565952
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.012273648008759987
},
"harness|gsm8k|5": {
"acc": 0.2259287338893101,
"acc_stderr": 0.011519098777279958
}
}
```
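For programmatic access, the results file linked above can also be downloaded directly. The sketch below is illustrative only: the filename is taken from the "Latest results" link, and the file layout is assumed to either match the object shown above or wrap it under a "results" key.
```python
import json

from huggingface_hub import hf_hub_download

# Illustrative sketch: fetch the linked results file and print per-task accuracy.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2",
    repo_type="dataset",
    filename="results_2024-01-14T00-32-07.397103.json",
)
with open(path) as f:
    data = json.load(f)

per_task = data.get("results", data)  # handle either layout (wrapped or flat)
for task, metrics in per_task.items():
    if isinstance(metrics, dict) and "acc" in metrics:
        print(f"{task}: acc={metrics['acc']:.4f}")
```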
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2 | [
"region:us"
] | 2024-01-14T00:34:23+00:00 | {"pretty_name": "Evaluation run of jefferylovely/AiMaven-Orca2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jefferylovely/AiMaven-Orca2](https://huggingface.co/jefferylovely/AiMaven-Orca2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T00:32:07.397103](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__AiMaven-Orca2/blob/main/results_2024-01-14T00-32-07.397103.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.54465465733523,\n \"acc_stderr\": 0.034129622181532,\n \"acc_norm\": 0.5502915050766849,\n \"acc_norm_stderr\": 0.03486859931303051,\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5343298654242948,\n \"mc2_stderr\": 0.01618337374565952\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5187713310580204,\n \"acc_stderr\": 0.014601090150633964,\n \"acc_norm\": 0.5469283276450512,\n \"acc_norm_stderr\": 0.014546892052005628\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6054570802628958,\n \"acc_stderr\": 0.004877534215987091,\n \"acc_norm\": 0.789982075283808,\n \"acc_norm_stderr\": 0.0040648854960034396\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.039255233810529325,\n \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.039255233810529325\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955784,\n \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955784\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5763888888888888,\n \"acc_stderr\": 0.0413212501972337,\n \"acc_norm\": 0.5763888888888888,\n \"acc_norm_stderr\": 0.0413212501972337\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 
0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562429,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562429\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699947,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699947\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6387096774193548,\n \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.6387096774193548,\n \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6616161616161617,\n \"acc_stderr\": 0.03371124142626303,\n \"acc_norm\": 0.6616161616161617,\n \"acc_norm_stderr\": 0.03371124142626303\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7616580310880829,\n \"acc_stderr\": 0.030748905363909895,\n \"acc_norm\": 0.7616580310880829,\n \"acc_norm_stderr\": 0.030748905363909895\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5384615384615384,\n \"acc_stderr\": 0.025275892070240637,\n \"acc_norm\": 0.5384615384615384,\n \"acc_norm_stderr\": 0.025275892070240637\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5630252100840336,\n \"acc_stderr\": 0.032219436365661956,\n \"acc_norm\": 0.5630252100840336,\n \"acc_norm_stderr\": 0.032219436365661956\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.726605504587156,\n \"acc_stderr\": 0.01910929984609829,\n \"acc_norm\": 0.726605504587156,\n \"acc_norm_stderr\": 0.01910929984609829\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693268,\n \"acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693268\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.027236013946196694,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.027236013946196694\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7381864623243933,\n 
\"acc_stderr\": 0.015720838678445266,\n \"acc_norm\": 0.7381864623243933,\n \"acc_norm_stderr\": 0.015720838678445266\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574904,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574904\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6109324758842444,\n \"acc_stderr\": 0.027690337536485372,\n \"acc_norm\": 0.6109324758842444,\n \"acc_norm_stderr\": 0.027690337536485372\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138015,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138015\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557309,\n \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557309\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38070404172099087,\n \"acc_stderr\": 0.012401430654645893,\n \"acc_norm\": 0.38070404172099087,\n \"acc_norm_stderr\": 0.012401430654645893\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5073529411764706,\n \"acc_stderr\": 0.030369552523902173,\n \"acc_norm\": 0.5073529411764706,\n \"acc_norm_stderr\": 0.030369552523902173\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5343137254901961,\n \"acc_stderr\": 0.020180144843307293,\n \"acc_norm\": 0.5343137254901961,\n \"acc_norm_stderr\": 0.020180144843307293\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6571428571428571,\n \"acc_stderr\": 0.030387262919547728,\n \"acc_norm\": 0.6571428571428571,\n \"acc_norm_stderr\": 0.030387262919547728\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5472636815920398,\n \"acc_stderr\": 0.03519702717576915,\n \"acc_norm\": 0.5472636815920398,\n \"acc_norm_stderr\": 0.03519702717576915\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n \"acc_stderr\": 0.038913644958358196,\n \"acc_norm\": 0.4879518072289157,\n \"acc_norm_stderr\": 0.038913644958358196\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209204,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209204\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088365,\n \"mc2\": 0.5343298654242948,\n \"mc2_stderr\": 0.01618337374565952\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.012273648008759987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2259287338893101,\n \"acc_stderr\": 0.011519098777279958\n }\n}\n```", 
"repo_url": "https://huggingface.co/jefferylovely/AiMaven-Orca2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|arc:challenge|25_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|gsm8k|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hellaswag|10_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-32-07.397103.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-32-07.397103.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-32-07.397103.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T00-32-07.397103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-32-07.397103.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T00_32_07.397103", "path": ["**/details_harness|winogrande|5_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T00-32-07.397103.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T00_32_07.397103", "path": ["results_2024-01-14T00-32-07.397103.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T00-32-07.397103.parquet"]}]}]} | 2024-01-14T00:34:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jefferylovely/AiMaven-Orca2
Dataset automatically created during the evaluation run of model jefferylovely/AiMaven-Orca2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-14T00:32:07.397103 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jefferylovely/AiMaven-Orca2\n\n\n\nDataset automatically created during the evaluation run of model jefferylovely/AiMaven-Orca2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T00:32:07.397103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jefferylovely/AiMaven-Orca2\n\n\n\nDataset automatically created during the evaluation run of model jefferylovely/AiMaven-Orca2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T00:32:07.397103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5ed688ed6e1be468a2c8aa53ae5394a91f408e96 |
# Dataset Card for Evaluation run of ibndias/Nous-Hermes-2-MoE-2x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ibndias/Nous-Hermes-2-MoE-2x34B](https://huggingface.co/ibndias/Nous-Hermes-2-MoE-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B",
"harness_winogrande_5",
split="train")
```
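
The aggregated scores (rather than the per-sample details) live in the additional "results" configuration mentioned above. The following is only a minimal sketch: the `"results"` config name and `"latest"` split are taken from this card's metadata, so adjust them if the repository layout changes.

```python
from datasets import load_dataset

# The "latest" split holds the most recent run; its fields carry the per-task
# metrics (acc, acc_norm, stderr, ...) shown in the "Latest results" section below.
results = load_dataset(
    "open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B",
    "results",
    split="latest",
)
print(results[0])
```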
## Latest results
These are the [latest results from run 2024-01-14T00:41:59.190674](https://huggingface.co/datasets/open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B/blob/main/results_2024-01-14T00-41-59.190674.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.761185340730833,
"acc_stderr": 0.02810264232361143,
"acc_norm": 0.7648166441855127,
"acc_norm_stderr": 0.02863812731410329,
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5808164969122677,
"mc2_stderr": 0.014977589951125109
},
"harness|arc:challenge|25": {
"acc": 0.6424914675767918,
"acc_stderr": 0.014005494275916573,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.013778687054176534
},
"harness|hellaswag|10": {
"acc": 0.6606253734315873,
"acc_stderr": 0.004725293905228259,
"acc_norm": 0.8572993427604063,
"acc_norm_stderr": 0.003490524965061915
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.024270227737522715,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.024270227737522715
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7962264150943397,
"acc_stderr": 0.024790784501775402,
"acc_norm": 0.7962264150943397,
"acc_norm_stderr": 0.024790784501775402
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.025545239210256917,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.025545239210256917
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.049665709039785295,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.049665709039785295
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7787234042553192,
"acc_stderr": 0.027136349602424056,
"acc_norm": 0.7787234042553192,
"acc_norm_stderr": 0.027136349602424056
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7862068965517242,
"acc_stderr": 0.034165204477475494,
"acc_norm": 0.7862068965517242,
"acc_norm_stderr": 0.034165204477475494
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6746031746031746,
"acc_stderr": 0.024130158299762613,
"acc_norm": 0.6746031746031746,
"acc_norm_stderr": 0.024130158299762613
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8903225806451613,
"acc_stderr": 0.01777677870048519,
"acc_norm": 0.8903225806451613,
"acc_norm_stderr": 0.01777677870048519
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.625615763546798,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.625615763546798,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8727272727272727,
"acc_stderr": 0.026024657651656187,
"acc_norm": 0.8727272727272727,
"acc_norm_stderr": 0.026024657651656187
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.02239078763821677,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.02239078763821677
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9792746113989638,
"acc_stderr": 0.010281417011909042,
"acc_norm": 0.9792746113989638,
"acc_norm_stderr": 0.010281417011909042
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246787,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246787
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4703703703703704,
"acc_stderr": 0.030431963547936584,
"acc_norm": 0.4703703703703704,
"acc_norm_stderr": 0.030431963547936584
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.023005459446673936,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.023005459446673936
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248437,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248437
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9247706422018349,
"acc_stderr": 0.011308662537571762,
"acc_norm": 0.9247706422018349,
"acc_norm_stderr": 0.011308662537571762
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.919831223628692,
"acc_stderr": 0.01767667999189163,
"acc_norm": 0.919831223628692,
"acc_norm_stderr": 0.01767667999189163
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7892376681614349,
"acc_stderr": 0.02737309550054019,
"acc_norm": 0.7892376681614349,
"acc_norm_stderr": 0.02737309550054019
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342344,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342344
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9173553719008265,
"acc_stderr": 0.025135382356604227,
"acc_norm": 0.9173553719008265,
"acc_norm_stderr": 0.025135382356604227
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.029239272675632748,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.029239272675632748
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.026321383198783674,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.026321383198783674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.9223300970873787,
"acc_stderr": 0.02650144078476276,
"acc_norm": 0.9223300970873787,
"acc_norm_stderr": 0.02650144078476276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.01653462768431136,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.01653462768431136
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.01046101533819307,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.01046101533819307
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135026,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135026
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6346368715083799,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.6346368715083799,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02027940293617458,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02027940293617458
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.02135534302826405,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.02135534302826405
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.01748643278588071,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.01748643278588071
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.028267657482650154,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.028267657482650154
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.605606258148631,
"acc_stderr": 0.012482141665631176,
"acc_norm": 0.605606258148631,
"acc_norm_stderr": 0.012482141665631176
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.015422512066262554,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.015422512066262554
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8367346938775511,
"acc_stderr": 0.023661699177098608,
"acc_norm": 0.8367346938775511,
"acc_norm_stderr": 0.023661699177098608
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015578,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015578
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41982864137086906,
"mc1_stderr": 0.01727703030177577,
"mc2": 0.5808164969122677,
"mc2_stderr": 0.014977589951125109
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781103
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515425
}
}
```
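
The same numbers can also be pulled as raw JSON instead of going through the `datasets` configurations. This is only a sketch: the filename is taken from the link above, and the script assumes the task-to-metrics mapping sits either at the top level or under a `"results"` key, which is the usual layout of these files.

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the per-run results file referenced in the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B",
    repo_type="dataset",
    filename="results_2024-01-14T00-41-59.190674.json",
)

with open(path) as f:
    report = json.load(f)

# Use the "results" section if present, otherwise treat the file as the mapping itself.
scores = report.get("results", report)

# List the five weakest MMLU subtasks for this run.
mmlu = {k: v["acc"] for k, v in scores.items() if k.startswith("harness|hendrycksTest")}
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1])[:5]:
    print(f"{task}: {acc:.3f}")
```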
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B | [
"region:us"
] | 2024-01-14T00:44:12+00:00 | {"pretty_name": "Evaluation run of ibndias/Nous-Hermes-2-MoE-2x34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [ibndias/Nous-Hermes-2-MoE-2x34B](https://huggingface.co/ibndias/Nous-Hermes-2-MoE-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T00:41:59.190674](https://huggingface.co/datasets/open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B/blob/main/results_2024-01-14T00-41-59.190674.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.761185340730833,\n \"acc_stderr\": 0.02810264232361143,\n \"acc_norm\": 0.7648166441855127,\n \"acc_norm_stderr\": 0.02863812731410329,\n \"mc1\": 0.41982864137086906,\n \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5808164969122677,\n \"mc2_stderr\": 0.014977589951125109\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6424914675767918,\n \"acc_stderr\": 0.014005494275916573,\n \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176534\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6606253734315873,\n \"acc_stderr\": 0.004725293905228259,\n \"acc_norm\": 0.8572993427604063,\n \"acc_norm_stderr\": 0.003490524965061915\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.024270227737522715,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.024270227737522715\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7962264150943397,\n \"acc_stderr\": 0.024790784501775402,\n \"acc_norm\": 0.7962264150943397,\n \"acc_norm_stderr\": 0.024790784501775402\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n \"acc_stderr\": 0.025545239210256917,\n \"acc_norm\": 0.8958333333333334,\n \"acc_norm_stderr\": 0.025545239210256917\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.049665709039785295,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.049665709039785295\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7787234042553192,\n \"acc_stderr\": 0.027136349602424056,\n \"acc_norm\": 0.7787234042553192,\n \"acc_norm_stderr\": 0.027136349602424056\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7862068965517242,\n \"acc_stderr\": 0.034165204477475494,\n \"acc_norm\": 0.7862068965517242,\n \"acc_norm_stderr\": 0.034165204477475494\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6746031746031746,\n \"acc_stderr\": 0.024130158299762613,\n \"acc_norm\": 0.6746031746031746,\n \"acc_norm_stderr\": 0.024130158299762613\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8903225806451613,\n \"acc_stderr\": 0.01777677870048519,\n \"acc_norm\": 0.8903225806451613,\n \"acc_norm_stderr\": 0.01777677870048519\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.625615763546798,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.625615763546798,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8727272727272727,\n \"acc_stderr\": 0.026024657651656187,\n \"acc_norm\": 0.8727272727272727,\n \"acc_norm_stderr\": 0.026024657651656187\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.02239078763821677,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.02239078763821677\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9792746113989638,\n \"acc_stderr\": 0.010281417011909042,\n \"acc_norm\": 0.9792746113989638,\n \"acc_norm_stderr\": 0.010281417011909042\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246787,\n \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246787\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4703703703703704,\n \"acc_stderr\": 0.030431963547936584,\n \"acc_norm\": 0.4703703703703704,\n \"acc_norm_stderr\": 0.030431963547936584\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.023005459446673936,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.023005459446673936\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248437,\n \"acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248437\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9247706422018349,\n \"acc_stderr\": 0.011308662537571762,\n \"acc_norm\": 0.9247706422018349,\n \"acc_norm_stderr\": 0.011308662537571762\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.919831223628692,\n \"acc_stderr\": 0.01767667999189163,\n \"acc_norm\": 0.919831223628692,\n \"acc_norm_stderr\": 0.01767667999189163\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7892376681614349,\n \"acc_stderr\": 0.02737309550054019,\n \"acc_norm\": 0.7892376681614349,\n \"acc_norm_stderr\": 0.02737309550054019\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342344,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342344\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9173553719008265,\n \"acc_stderr\": 0.025135382356604227,\n \"acc_norm\": 0.9173553719008265,\n \"acc_norm_stderr\": 0.025135382356604227\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.029239272675632748,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.029239272675632748\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.026321383198783674,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.026321383198783674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9223300970873787,\n \"acc_stderr\": 0.02650144078476276,\n \"acc_norm\": 0.9223300970873787,\n \"acc_norm_stderr\": 0.02650144078476276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n \"acc_stderr\": 0.01653462768431136,\n \"acc_norm\": 0.9316239316239316,\n \"acc_norm_stderr\": 0.01653462768431136\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9054916985951469,\n \"acc_stderr\": 0.01046101533819307,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 0.01046101533819307\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135026,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135026\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6346368715083799,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.6346368715083799,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02027940293617458,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02027940293617458\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n \"acc_stderr\": 0.02135534302826405,\n \"acc_norm\": 0.8295819935691319,\n \"acc_norm_stderr\": 0.02135534302826405\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.01748643278588071,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.01748643278588071\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.028267657482650154,\n \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.028267657482650154\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.605606258148631,\n \"acc_stderr\": 0.012482141665631176,\n \"acc_norm\": 0.605606258148631,\n \"acc_norm_stderr\": 0.012482141665631176\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.015422512066262554,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.015422512066262554\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8367346938775511,\n \"acc_stderr\": 0.023661699177098608,\n \"acc_norm\": 0.8367346938775511,\n \"acc_norm_stderr\": 0.023661699177098608\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015578,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015578\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41982864137086906,\n \"mc1_stderr\": 0.01727703030177577,\n \"mc2\": 0.5808164969122677,\n \"mc2_stderr\": 0.014977589951125109\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781103\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \"acc_stderr\": 0.012679297549515425\n }\n}\n```", 
"repo_url": "https://huggingface.co/ibndias/Nous-Hermes-2-MoE-2x34B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|arc:challenge|25_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|gsm8k|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hellaswag|10_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-41-59.190674.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-41-59.190674.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-41-59.190674.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T00-41-59.190674.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-41-59.190674.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T00_41_59.190674", "path": ["**/details_harness|winogrande|5_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T00-41-59.190674.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T00_41_59.190674", "path": ["results_2024-01-14T00-41-59.190674.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T00-41-59.190674.parquet"]}]}]} | 2024-01-14T00:44:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ibndias/Nous-Hermes-2-MoE-2x34B
Dataset automatically created during the evaluation run of model ibndias/Nous-Hermes-2-MoE-2x34B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
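For example, the snippet below (reproduced from the loader given in the dataset configuration above) pulls the `harness_winogrande_5` details split; any other configuration name listed in the configs works the same way:

```python
from datasets import load_dataset

# load the winogrande details split from the latest evaluation run
data = load_dataset(
    "open-llm-leaderboard/details_ibndias__Nous-Hermes-2-MoE-2x34B",
    "harness_winogrande_5",
    split="train",
)
```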
## Latest results
These are the latest results from run 2024-01-14T00:41:59.190674 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ibndias/Nous-Hermes-2-MoE-2x34B\n\n\n\nDataset automatically created during the evaluation run of model ibndias/Nous-Hermes-2-MoE-2x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T00:41:59.190674(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ibndias/Nous-Hermes-2-MoE-2x34B\n\n\n\nDataset automatically created during the evaluation run of model ibndias/Nous-Hermes-2-MoE-2x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T00:41:59.190674(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0df2fdcf6a51f7f4adfad2553b10a5c5c5c8cad9 | # TLDR
* wikipedia page: [Road signs in Malaysia](https://en.wikipedia.org/wiki/Road_signs_in_Malaysia)
* num. of images: 365
* contributed to: https://github.com/orgs/malaysia-ai/projects/9/views/1?pane=issue&itemId=43619647
* date scraped: 14th January 2024 | wanadzhar913/wikipedia-malaysian-road-sign-images | [
"license:apache-2.0",
"region:us"
] | 2024-01-14T01:01:06+00:00 | {"license": "apache-2.0"} | 2024-01-14T01:20:28+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
| # TLDR
* wikipedia page: Road signs in Malaysia
* num. of images: 365
* contributed to: URL
* date scraped: 14th January 2024 | [
"# TLDR\n\n* wikipedia page: Road signs in Malaysia\n* num. of images: 365\n* contributed to: URL\n* date scraped: 14th January 2024"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"# TLDR\n\n* wikipedia page: Road signs in Malaysia\n* num. of images: 365\n* contributed to: URL\n* date scraped: 14th January 2024"
] |
b52abf8b1f4dee4cc46f3c1e41fd54a953963286 |
# Dataset of ting_an/定安/定安 (Azur Lane)
This is the dataset of ting_an/定安/定安 (Azur Lane), containing 40 images and their tags.
The core tags of this character are `breasts, earrings, bangs, black_hair, large_breasts, long_hair, mole, mole_under_eye, huge_breasts, purple_eyes, hair_ornament, pink_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 40 | 58.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 40 | 29.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 93 | 63.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 40 | 49.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 93 | 94.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ting_an_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ting_an_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
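
The packaged variants in the table above can be fetched the same way by changing the `filename` argument. Below is a minimal sketch for the `dataset-800.zip` IMG+TXT package; it assumes each image in the archive is paired with a same-named `.txt` file containing its comma-separated tags, which is an assumption about the layout rather than something this card guarantees.

```python
import os
import zipfile
from pathlib import Path

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/ting_an_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it next to the raw dataset directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# assumed layout: every image ships with a same-named .txt file of comma-separated tags
for img_path in sorted(Path(dataset_dir).rglob('*')):
    if img_path.suffix.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    txt_path = img_path.with_suffix('.txt')
    if txt_path.exists():
        tags = [t.strip() for t in txt_path.read_text(encoding='utf-8').split(',') if t.strip()]
        print(img_path.name, tags)
```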
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, blush, chinese_clothes, cleavage, covered_navel, jewelry, looking_at_viewer, solo, cameltoe, curvy, parted_lips, thick_thighs, dress, hair_over_shoulder, revealing_clothes, smile, braid, leotard, mature_female, pelvic_curtain, see-through, sideboob, hand_up, indoors, plump, pussy_juice |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, breast_curtains, jewelry, looking_at_viewer, open_mouth, solo, blush, cleavage, covered_navel, fur_trim, pelvic_curtain, revealing_clothes, see-through, sideboob, china_dress, cowboy_shot, parted_bangs, smile, thighs, white_thighhighs, covered_nipples, detached_sleeves, hair_over_shoulder, mole_on_breast, underwear |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, 1boy, jewelry, solo_focus, cum_on_breasts, open_mouth, censored, heart, nipples, nude, sweat, symbol-shaped_pupils, bare_shoulders, breast_grab, breasts_squeezed_together, cleavage, cum_on_hair, facial, grabbing, looking_at_viewer, on_back, paizuri_under_clothes, penis, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | chinese_clothes | cleavage | covered_navel | jewelry | looking_at_viewer | solo | cameltoe | curvy | parted_lips | thick_thighs | dress | hair_over_shoulder | revealing_clothes | smile | braid | leotard | mature_female | pelvic_curtain | see-through | sideboob | hand_up | indoors | plump | pussy_juice | breast_curtains | open_mouth | fur_trim | china_dress | cowboy_shot | parted_bangs | thighs | white_thighhighs | covered_nipples | detached_sleeves | mole_on_breast | underwear | 1boy | solo_focus | cum_on_breasts | censored | heart | nipples | nude | sweat | symbol-shaped_pupils | breast_grab | breasts_squeezed_together | cum_on_hair | facial | grabbing | on_back | paizuri_under_clothes | penis |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:------------------|:-----------|:----------------|:----------|:--------------------|:-------|:-----------|:--------|:--------------|:---------------|:--------|:---------------------|:--------------------|:--------|:--------|:----------|:----------------|:-----------------|:--------------|:-----------|:----------|:----------|:--------|:--------------|:------------------|:-------------|:-----------|:--------------|:--------------|:---------------|:---------|:-------------------|:------------------|:-------------------|:-----------------|:------------|:-------|:-------------|:-----------------|:-----------|:--------|:----------|:-------|:--------|:-----------------------|:--------------|:----------------------------|:--------------|:---------|:-----------|:----------|:------------------------|:--------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | X | X | X | X | | | | | | X | X | X | | | | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | X | | X | X | | | | | | | | | X | | | | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ting_an_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:07:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:17:29+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ting\_an/定安/定安 (Azur Lane)
=====================================
This is the dataset of ting\_an/定安/定安 (Azur Lane), containing 40 images and their tags.
The core tags of this character are 'breasts, earrings, bangs, black\_hair, large\_breasts, long\_hair, mole, mole\_under\_eye, huge\_breasts, purple\_eyes, hair\_ornament, pink\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
88dc31c15afe1acab837aa8816c384291ee0b6fc |
# Dataset of marseillaise/マルセイエーズ/马赛曲 (Azur Lane)
This is the dataset of marseillaise/マルセイエーズ/马赛曲 (Azur Lane), containing 23 images and their tags.
The core tags of this character are `breasts, long_hair, red_eyes, large_breasts, bangs, white_hair, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 23 | 52.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 23 | 21.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 63 | 49.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 23 | 43.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 63 | 83.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/marseillaise_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/marseillaise_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
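
Once the archive is extracted as above, the per-image tag metadata can also be aggregated to see which tags dominate the crawl. This is only a sketch: it assumes `item.meta['tags']` is either a mapping from tag names to scores or a plain list of tag strings, and handles both cases defensively.

```python
from collections import Counter

from waifuc.source import LocalSource

counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta.get('tags', {})
    # tags may be a dict (tag -> score) or a list of tag strings; count the names either way
    names = tags.keys() if isinstance(tags, dict) else tags
    counter.update(names)

# show the 20 most frequent tags in this character's crawl
for tag, count in counter.most_common(20):
    print(f'{tag}: {count}')
```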
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, blush, cleavage, detached_sleeves, looking_at_viewer, white_dress, black_thighhighs, navel, black_gloves, closed_mouth, hair_ornament, thighs, horns, smile, cowboy_shot, long_sleeves, panties, simple_background, white_background |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_pants, looking_at_viewer, solo, sports_bra, yoga_pants, ass, bare_shoulders, blush, no_shoes, sweat, sitting, closed_mouth, grey_hair, looking_back, white_socks |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | cleavage | detached_sleeves | looking_at_viewer | white_dress | black_thighhighs | navel | black_gloves | closed_mouth | hair_ornament | thighs | horns | smile | cowboy_shot | long_sleeves | panties | simple_background | white_background | black_pants | sports_bra | yoga_pants | ass | bare_shoulders | no_shoes | sweat | sitting | grey_hair | looking_back | white_socks |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-----------|:-------------------|:--------------------|:--------------|:-------------------|:--------|:---------------|:---------------|:----------------|:---------|:--------|:--------|:--------------|:---------------|:----------|:--------------------|:-------------------|:--------------|:-------------|:-------------|:------|:-----------------|:-----------|:--------|:----------|:------------|:---------------|:--------------|
| 0 | 8 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | X | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/marseillaise_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:07:44+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:16:12+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of marseillaise/マルセイエーズ/马赛曲 (Azur Lane)
===============================================
This is the dataset of marseillaise/マルセイエーズ/马赛曲 (Azur Lane), containing 23 images and their tags.
The core tags of this character are 'breasts, long\_hair, red\_eyes, large\_breasts, bangs, white\_hair, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
4c40149199069b8aaf36738e227397d8d1899765 |
# Dataset of aulick/オーリック/奥利克 (Azur Lane)
This is the dataset of aulick/オーリック/奥利克 (Azur Lane), containing 10 images and their tags.
The core tags of this character are `hair_ornament, hairclip, short_hair, hat, beret, bangs, green_eyes, hair_between_eyes, red_hair, sailor_hat, white_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 7.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 5.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 20 | 9.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 7.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 20 | 12.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/aulick_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/aulick_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, solo, open_mouth, sailor_collar, looking_at_viewer, sailor_dress, white_gloves, yellow_neckerchief, :d, simple_background, sleeveless_dress, white_background, white_thighhighs, blue_dress, feathers, frilled_dress, hat_feather, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | solo | open_mouth | sailor_collar | looking_at_viewer | sailor_dress | white_gloves | yellow_neckerchief | :d | simple_background | sleeveless_dress | white_background | white_thighhighs | blue_dress | feathers | frilled_dress | hat_feather | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:----------------|:--------------------|:---------------|:---------------|:---------------------|:-----|:--------------------|:-------------------|:-------------------|:-------------------|:-------------|:-----------|:----------------|:--------------|:----------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/aulick_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:07:45+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:11:10+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of aulick/オーリック/奥利克 (Azur Lane)
=======================================
This is the dataset of aulick/オーリック/奥利克 (Azur Lane), containing 10 images and their tags.
The core tags of this character are 'hair\_ornament, hairclip, short\_hair, hat, beret, bangs, green\_eyes, hair\_between\_eyes, red\_hair, sailor\_hat, white\_headwear', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7191ed3d818fd4ba22c2221211da5c41106cafe5 |
# Dataset of hans_ludemann/ハンス・リューデマン/Z18 (Azur Lane)
This is the dataset of hans_ludemann/ハンス・リューデマン/Z18 (Azur Lane), containing 22 images and their tags.
The core tags of this character are `blonde_hair, long_hair, twintails, blue_eyes, hair_ornament, hairclip, hat, bow, fang, breasts, hair_between_eyes, small_breasts, bangs, very_long_hair, black_headwear`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 29.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 17.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 39.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 27.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 54.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hans_ludemann_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hans_ludemann_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | blush, 1girl, solo, looking_at_viewer, navel, open_mouth, fingerless_gloves, smile, black_gloves, skirt, white_panties, black_thighhighs, jacket, open_clothes, training_bra |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | solo | looking_at_viewer | navel | open_mouth | fingerless_gloves | smile | black_gloves | skirt | white_panties | black_thighhighs | jacket | open_clothes | training_bra |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:--------------------|:--------|:-------------|:--------------------|:--------|:---------------|:--------|:----------------|:-------------------|:---------|:---------------|:---------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hans_ludemann_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:07:50+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:29:33+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of hans\_ludemann/ハンス・リューデマン/Z18 (Azur Lane)
====================================================
This is the dataset of hans\_ludemann/ハンス・リューデマン/Z18 (Azur Lane), containing 22 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, twintails, blue\_eyes, hair\_ornament, hairclip, hat, bow, fang, breasts, hair\_between\_eyes, small\_breasts, bangs, very\_long\_hair, black\_headwear', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b9bdd6281ebe5a55dfff6352bd03633d537829d4 |
# Dataset of oyashio/親潮/亲潮 (Azur Lane)
This is the dataset of oyashio/親潮/亲潮 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `hair_ornament, hair_bun, x_hair_ornament, braid, bangs, fang, hair_between_eyes, horns, double_bun, blonde_hair, blue_eyes, pointy_ears, sidelocks, breasts, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 14.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 8.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 28 | 17.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 12.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 24.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/oyashio_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/oyashio_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, bare_shoulders, blush, detached_sleeves, japanese_clothes, long_sleeves, looking_at_viewer, open_mouth, simple_background, solo, white_background, wide_sleeves, :d, black_gloves, black_skirt, pleated_skirt, single_thighhigh, sleeveless, standing, uneven_legwear, full_body, partially_fingerless_gloves, shirt, side-tie_panties, single_kneehigh, black_footwear, bridal_gauntlets, crossed_bangs, green_eyes, index_finger_raised, jewelry, legs_apart, long_hair, machinery, magatama, minigirl, miniskirt, mismatched_legwear, oni_horns, pigeon-toed, sash, side_slit, single_glove, single_hair_bun, small_breasts, torpedo_tubes, turret, white_sleeves, zettai_ryouiki |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | blush | detached_sleeves | japanese_clothes | long_sleeves | looking_at_viewer | open_mouth | simple_background | solo | white_background | wide_sleeves | :d | black_gloves | black_skirt | pleated_skirt | single_thighhigh | sleeveless | standing | uneven_legwear | full_body | partially_fingerless_gloves | shirt | side-tie_panties | single_kneehigh | black_footwear | bridal_gauntlets | crossed_bangs | green_eyes | index_finger_raised | jewelry | legs_apart | long_hair | machinery | magatama | minigirl | miniskirt | mismatched_legwear | oni_horns | pigeon-toed | sash | side_slit | single_glove | single_hair_bun | small_breasts | torpedo_tubes | turret | white_sleeves | zettai_ryouiki |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------|:-------------------|:-------------------|:---------------|:--------------------|:-------------|:--------------------|:-------|:-------------------|:---------------|:-----|:---------------|:--------------|:----------------|:-------------------|:-------------|:-----------|:-----------------|:------------|:------------------------------|:--------|:-------------------|:------------------|:-----------------|:-------------------|:----------------|:-------------|:----------------------|:----------|:-------------|:------------|:------------|:-----------|:-----------|:------------|:---------------------|:------------|:--------------|:-------|:------------|:---------------|:------------------|:----------------|:----------------|:---------|:----------------|:-----------------|
| 0 | 5 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/oyashio_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:08:34+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:11:23+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of oyashio/親潮/亲潮 (Azur Lane)
====================================
This is the dataset of oyashio/親潮/亲潮 (Azur Lane), containing 12 images and their tags.
The core tags of this character are 'hair\_ornament, hair\_bun, x\_hair\_ornament, braid, bangs, fang, hair\_between\_eyes, horns, double\_bun, blonde\_hair, blue\_eyes, pointy\_ears, sidelocks, breasts, brown\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
9c6a8408845962ee5f63b19162466fbb69d7535e |
# Dataset of mary_celeste/メアリー・セレスト/玛丽·西莱斯特号 (Azur Lane)
This is the dataset of mary_celeste/メアリー・セレスト/玛丽·西莱斯特号 (Azur Lane), containing 27 images and their tags.
The core tags of this character are `blue_hair, breasts, horns, large_breasts, long_hair, pointy_ears, blue_eyes, bangs, hair_between_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 27 | 58.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 27 | 25.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 68 | 54.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 27 | 46.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 68 | 86.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/mary_celeste_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/mary_celeste_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, long_sleeves, navel, solo, looking_at_viewer, smile, torn_clothes, belt, black_coat, open_mouth, open_coat, tentacles, thighs, barefoot, blush, revealing_clothes, sitting, stomach, fang, water |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, solo, black_dress, wings, covered_navel, parted_lips, bare_shoulders, barefoot, holding, sleeveless_dress, underboob_cutout, earrings, full_body, sideboob |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | long_sleeves | navel | solo | looking_at_viewer | smile | torn_clothes | belt | black_coat | open_mouth | open_coat | tentacles | thighs | barefoot | blush | revealing_clothes | sitting | stomach | fang | water | black_dress | wings | covered_navel | parted_lips | bare_shoulders | holding | sleeveless_dress | underboob_cutout | earrings | full_body | sideboob |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-------|:--------------------|:--------|:---------------|:-------|:-------------|:-------------|:------------|:------------|:---------|:-----------|:--------|:--------------------|:----------|:----------|:-------|:--------|:--------------|:--------|:----------------|:--------------|:-----------------|:----------|:-------------------|:-------------------|:-----------|:------------|:-----------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | X | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/mary_celeste_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:08:43+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:17:30+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of mary\_celeste/メアリー・セレスト/玛丽·西莱斯特号 (Azur Lane)
=======================================================
This is the dataset of mary\_celeste/メアリー・セレスト/玛丽·西莱斯特号 (Azur Lane), containing 27 images and their tags.
The core tags of this character are 'blue\_hair, breasts, horns, large\_breasts, long\_hair, pointy\_ears, blue\_eyes, bangs, hair\_between\_eyes, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
f93286104328116a0382df20e6b8a42e09e04ab4 |
# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-2x7b-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-2x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo",
"harness_winogrande_5",
split="train")
```
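
The aggregated metrics described above live in the "results" configuration and can be loaded the same way; a minimal sketch, with the exact column layout left unspecified since it is not documented in this card.

```python
from datasets import load_dataset

# load the aggregated results; the "train" split always points to the latest run
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo",
    "results",
    split="train",
)
print(results)
```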
## Latest results
These are the [latest results from run 2024-01-14T01:13:57.359475](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo/blob/main/results_2024-01-14T01-13-57.359475.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6323249282667325,
"acc_stderr": 0.03235123186693868,
"acc_norm": 0.63602882598941,
"acc_norm_stderr": 0.03299471578731984,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.01738476747898622,
"mc2": 0.6075861082832835,
"mc2_stderr": 0.015099206529299735
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.659556313993174,
"acc_norm_stderr": 0.013847460518892978
},
"harness|hellaswag|10": {
"acc": 0.6661023700458076,
"acc_stderr": 0.004706398252382464,
"acc_norm": 0.8579964150567616,
"acc_norm_stderr": 0.0034834044902359936
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440679,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440679
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478923,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848033,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899129,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899129
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865467,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865467
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153266,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399682,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399682
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824876,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824876
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825362,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825362
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.01738476747898622,
"mc2": 0.6075861082832835,
"mc2_stderr": 0.015099206529299735
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.4829416224412434,
"acc_stderr": 0.013764467123761318
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo | [
"region:us"
] | 2024-01-14T01:16:11+00:00 | {"pretty_name": "Evaluation run of macadeliccc/laser-dolphin-mixtral-2x7b-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-2x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:13:57.359475](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo/blob/main/results_2024-01-14T01-13-57.359475.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6323249282667325,\n \"acc_stderr\": 0.03235123186693868,\n \"acc_norm\": 0.63602882598941,\n \"acc_norm_stderr\": 0.03299471578731984,\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.01738476747898622,\n \"mc2\": 0.6075861082832835,\n \"mc2_stderr\": 0.015099206529299735\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n \"acc_norm\": 0.659556313993174,\n \"acc_norm_stderr\": 0.013847460518892978\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6661023700458076,\n \"acc_stderr\": 0.004706398252382464,\n \"acc_norm\": 0.8579964150567616,\n \"acc_norm_stderr\": 0.0034834044902359936\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440679,\n \"acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440679\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n 
\"acc_norm_stderr\": 0.023381935348121437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478923,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478923\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848033,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848033\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n \"acc_stderr\": 0.013964393769899129,\n \"acc_norm\": 0.8122605363984674,\n \"acc_norm_stderr\": 0.013964393769899129\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265012,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265012\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n \"acc_stderr\": 0.016115235504865467,\n \"acc_norm\": 0.3664804469273743,\n \"acc_norm_stderr\": 0.016115235504865467\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n \"acc_stderr\": 0.026236965881153266,\n \"acc_norm\": 0.6913183279742765,\n \"acc_norm_stderr\": 0.026236965881153266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n \"acc_stderr\": 0.012697046024399682,\n \"acc_norm\": 0.44654498044328556,\n \"acc_norm_stderr\": 0.012697046024399682\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824876,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824876\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825362,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825362\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n \"mc1_stderr\": 0.01738476747898622,\n \"mc2\": 0.6075861082832835,\n \"mc2_stderr\": 0.015099206529299735\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.4829416224412434,\n \"acc_stderr\": 0.013764467123761318\n }\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/laser-dolphin-mixtral-2x7b-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-13-57.359475.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_13_57.359475", "path": ["**/details_harness|winogrande|5_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-13-57.359475.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T01_13_57.359475", "path": ["results_2024-01-14T01-13-57.359475.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T01-13-57.359475.parquet"]}]}]} | 2024-01-14T01:16:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-2x7b-dpo
Dataset automatically created during the evaluation run of model macadeliccc/laser-dolphin-mixtral-2x7b-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
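A minimal sketch, mirroring the loading snippet embedded in the dataset summary above (`harness_winogrande_5` is one of the 63 task configurations; `split="train"` resolves to the latest run):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task; swap the config name
# for any other task configuration (e.g. "harness_gsm8k_5").
data = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo",
    "harness_winogrande_5",
    split="train",
)
```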
## Latest results
These are the latest results from run 2024-01-14T01:13:57.359475 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
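To read these aggregated numbers programmatically, one option is to load the `results` configuration listed in the dataset metadata and take its `latest` split; the exact column layout of the results parquet is an assumption here:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-2x7b-dpo",
    "results",
    split="latest",
)
print(results[0])  # one row of aggregated metrics (column names may vary)
```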
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-2x7b-dpo\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/laser-dolphin-mixtral-2x7b-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:13:57.359475(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-2x7b-dpo\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/laser-dolphin-mixtral-2x7b-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:13:57.359475(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
5269531984378d65735f62af11cabe5ec9fc68dc |
# Dataset of ns2000/NS2000/NS2000 (Girls' Frontline)
This is the dataset of ns2000/NS2000/NS2000 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are `animal_ears, breasts, dark-skinned_female, dark_skin, rabbit_ears, red_eyes, large_breasts, long_hair, white_hair, bangs, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 13.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 8.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 28 | 16.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 12.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 28 | 21.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ns2000_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
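For quick experiments you do not need waifuc at all: the non-raw packages in the table above are plain zip archives of images plus sidecar `.txt` tag files (that is what the `IMG+TXT` type refers to). The sketch below downloads the 800px package and prints each image together with its tags; the target directory name and the assumption that every image ships with a same-named `.txt` file are illustrative rather than guaranteed, so adapt as needed.
```python
import os
import zipfile
from huggingface_hub import hf_hub_download

# download the pre-resized IMG+TXT package (shorter side <= 800px)
zip_file = hf_hub_download(
    repo_id='CyberHarem/ns2000_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract it into a working directory
package_dir = 'ns2000_800'
os.makedirs(package_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(package_dir)

# pair every image with its same-named .txt tag file (assumed layout)
for name in sorted(os.listdir(package_dir)):
    if name.lower().endswith(('.png', '.jpg', '.jpeg', '.webp')):
        tag_path = os.path.join(package_dir, os.path.splitext(name)[0] + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```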
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ns2000_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, navel, cleavage, looking_at_viewer, simple_background, open_mouth, smile, white_background, blush, gloves, shorts |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | navel | cleavage | looking_at_viewer | simple_background | open_mouth | smile | white_background | blush | gloves | shorts |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:-----------|:--------------------|:--------------------|:-------------|:--------|:-------------------|:--------|:---------|:---------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
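As a concrete example of the outfit mining mentioned above, the sketch below walks the extracted raw package with waifuc's `LocalSource` and keeps only images carrying a few of the cluster-0 tags from the table. It assumes the raw archive was already extracted to `dataset_dir` as in the loading snippet, and that `item.meta['tags']` supports membership tests on tag names; the chosen tag set and output file names are purely illustrative.
```python
from waifuc.source import LocalSource

# a few tags that characterise cluster #0 in the table above
wanted = {'navel', 'cleavage', 'gloves'}

source = LocalSource('dataset_dir')  # directory extracted from dataset-raw.zip
for item in source:
    tags = item.meta['tags']  # assumed to expose tag names via membership tests
    if all(tag in tags for tag in wanted):
        # export the matching image for closer inspection (illustrative naming)
        item.image.save('cluster0_' + item.meta['filename'])
```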
| CyberHarem/ns2000_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:20:32+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ns2000/NS2000/NS2000 (Girls' Frontline)
==================================================
This is the dataset of ns2000/NS2000/NS2000 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are 'animal\_ears, breasts, dark-skinned\_female, dark\_skin, rabbit\_ears, red\_eyes, large\_breasts, long\_hair, white\_hair, bangs, grey\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
e0bfe54a74e818bf98efc539a898da2ae6e1051f |
# Dataset of m1014/M1014/M1014 (Girls' Frontline)
This is the dataset of m1014/M1014/M1014 (Girls' Frontline), containing 30 images and their tags.
The core tags of this character are `long_hair, bangs, hair_between_eyes, heterochromia, red_eyes, hair_ornament, breasts, yellow_eyes, brown_hair, hat, black_hair, headphones, hairclip, large_breasts, very_long_hair, sidelocks, x_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 30 | 33.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 30 | 21.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 62 | 41.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 30 | 30.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 62 | 54.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1014_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m1014_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, black_gloves, looking_at_viewer, closed_mouth, fingerless_gloves, holding_gun, cleavage, dress, long_sleeves, beret, blush, brown_eyes, shotgun_shell, wide_sleeves, character_name, collarbone, medium_breasts, open_jacket, thigh_strap, white_background, standing |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, black_kimono, blush, hair_flower, long_sleeves, obi, solo, wide_sleeves, standing, full_body, gun, looking_at_viewer, sandals, single_hair_bun, tabi, zouri, cleavage, closed_mouth, holding_umbrella, oil-paper_umbrella, open_mouth, print_kimono, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_gloves | looking_at_viewer | closed_mouth | fingerless_gloves | holding_gun | cleavage | dress | long_sleeves | beret | blush | brown_eyes | shotgun_shell | wide_sleeves | character_name | collarbone | medium_breasts | open_jacket | thigh_strap | white_background | standing | black_kimono | hair_flower | obi | full_body | gun | sandals | single_hair_bun | tabi | zouri | holding_umbrella | oil-paper_umbrella | open_mouth | print_kimono | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:---------------|:--------------------|:--------------|:-----------|:--------|:---------------|:--------|:--------|:-------------|:----------------|:---------------|:-----------------|:-------------|:-----------------|:--------------|:--------------|:-------------------|:-----------|:---------------|:--------------|:------|:------------|:------|:----------|:------------------|:-------|:--------|:-------------------|:---------------------|:-------------|:---------------|:--------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 6 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | | | X | | X | | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m1014_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:23:59+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of m1014/M1014/M1014 (Girls' Frontline)
===============================================
This is the dataset of m1014/M1014/M1014 (Girls' Frontline), containing 30 images and their tags.
The core tags of this character are 'long\_hair, bangs, hair\_between\_eyes, heterochromia, red\_eyes, hair\_ornament, breasts, yellow\_eyes, brown\_hair, hat, black\_hair, headphones, hairclip, large\_breasts, very\_long\_hair, sidelocks, x\_hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
43235b8322f7b0494f42c8f4889d0d27807c7148 |
# Dataset of m1919a4/M1919A4/M1919A4 (Girls' Frontline)
This is the dataset of m1919a4/M1919A4/M1919A4 (Girls' Frontline), containing 28 images and their tags.
The core tags of this character are `blonde_hair, long_hair, hair_ornament, red_eyes, breasts, bangs, hairclip, small_breasts, pointy_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 28 | 26.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 28 | 20.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 60 | 37.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 28 | 25.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 60 | 45.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m1919a4_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m1919a4_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, open_mouth, solo, navel, blush, barefoot, cape, full_body, hair_bow, vampire, wrist_cuffs, fangs, skull_hair_ornament, fingernails, nipples, red_bow, simple_background, smile, toenail_polish, white_background |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, machine_gun, solo, long_sleeves, looking_at_viewer, smile, white_pantyhose, ammunition_belt, brown_headwear, brown_jacket, bullet, closed_mouth, garrison_cap, holding_gun, open_jacket, open_mouth, pink_eyes, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | open_mouth | solo | navel | blush | barefoot | cape | full_body | hair_bow | vampire | wrist_cuffs | fangs | skull_hair_ornament | fingernails | nipples | red_bow | simple_background | smile | toenail_polish | white_background | machine_gun | long_sleeves | white_pantyhose | ammunition_belt | brown_headwear | brown_jacket | bullet | closed_mouth | garrison_cap | holding_gun | open_jacket | pink_eyes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------------|:-------|:--------|:--------|:-----------|:-------|:------------|:-----------|:----------|:--------------|:--------|:----------------------|:--------------|:----------|:----------|:--------------------|:--------|:-----------------|:-------------------|:--------------|:---------------|:------------------|:------------------|:-----------------|:---------------|:---------|:---------------|:---------------|:--------------|:--------------|:------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | | | | | | | | | | | | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m1919a4_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:12+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:23:22+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of m1919a4/M1919A4/M1919A4 (Girls' Frontline)
=====================================================
This is the dataset of m1919a4/M1919A4/M1919A4 (Girls' Frontline), containing 28 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, hair\_ornament, red\_eyes, breasts, bangs, hairclip, small\_breasts, pointy\_ears', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
28d3d4f7c157eb6e36755f7e1ce86e333ae1e00e |
# Dataset of g43/G43/G43 (Girls' Frontline)
This is the dataset of g43/G43/G43 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `blonde_hair, hat, blue_eyes, black_headwear, braid, short_hair, military_hat, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 11.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 7.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 14.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 10.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 21.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/g43_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/g43_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, smile, looking_at_viewer, rifle, simple_background, white_background, closed_mouth, holding_gun, military_uniform, thighhighs, gloves, jewelry |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | smile | looking_at_viewer | rifle | simple_background | white_background | closed_mouth | holding_gun | military_uniform | thighhighs | gloves | jewelry |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:--------|:--------------------|:-------------------|:---------------|:--------------|:-------------------|:-------------|:---------|:----------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/g43_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:15+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:21:36+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of g43/G43/G43 (Girls' Frontline)
=========================================
This is the dataset of g43/G43/G43 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are 'blonde\_hair, hat, blue\_eyes, black\_headwear, braid, short\_hair, military\_hat, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
9936d437f7eb4a33cfa36237d58b89397d18c5ed |
# Dataset of caws/CAWS/CAWS (Girls' Frontline)
This is the dataset of caws/CAWS/CAWS (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are `black_hair, yellow_eyes, bangs, hair_bun, breasts, braid, medium_breasts, short_hair, eyeshadow, goggles_on_head, side_braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 21.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 13.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 22.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 19.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 31.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/caws_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/caws_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, goggles, makeup, blush, gloves, jacket, looking_at_viewer, holding_gun, hood_down, long_sleeves, open_clothes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | goggles | makeup | blush | gloves | jacket | looking_at_viewer | holding_gun | hood_down | long_sleeves | open_clothes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:---------|:--------|:---------|:---------|:--------------------|:--------------|:------------|:---------------|:---------------|
| 0 | 19 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/caws_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:21:44+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of caws/CAWS/CAWS (Girls' Frontline)
============================================
This is the dataset of caws/CAWS/CAWS (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are 'black\_hair, yellow\_eyes, bangs, hair\_bun, breasts, braid, medium\_breasts, short\_hair, eyeshadow, goggles\_on\_head, side\_braid', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1146ba4936e4f7619c1a0e52c6ab4b7277298156 |
# Dataset of ak_74u/AK-74U/AK-74U (Girls' Frontline)
This is the dataset of ak_74u/AK-74U/AK-74U (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, breasts, bangs, hair_between_eyes, long_hair, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 15.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 9.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 26 | 18.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 26 | 26.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ak_74u_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ak_74u_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, choker, cleavage, assault_rifle, black_jacket, white_background, fingerless_gloves, holding_gun, open_jacket, shorts, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | choker | cleavage | assault_rifle | black_jacket | white_background | fingerless_gloves | holding_gun | open_jacket | shorts | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------|:-----------|:----------------|:---------------|:-------------------|:--------------------|:--------------|:--------------|:---------|:--------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ak_74u_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:21:22+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ak\_74u/AK-74U/AK-74U (Girls' Frontline)
===================================================
This is the dataset of ak\_74u/AK-74U/AK-74U (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are 'blonde\_hair, blue\_eyes, breasts, bangs, hair\_between\_eyes, long\_hair, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7b48fc3f35a7df8086e0c71128d15809e3503cd6 |
# Dataset of fg42/FG42/FG42 (Girls' Frontline)
This is the dataset of fg42/FG42/FG42 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `blonde_hair, hat, blue_eyes, bangs, garrison_cap, long_hair, medium_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 18.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 7.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 15.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 14.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 27.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fg42_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fg42_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, white_gloves, black_pantyhose, blue_skirt, holding_gun, rifle, simple_background, standing, uniform, white_background, white_shirt, belt, black_necktie, blush, closed_mouth, full_body, pouch, short_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | white_gloves | black_pantyhose | blue_skirt | holding_gun | rifle | simple_background | standing | uniform | white_background | white_shirt | belt | black_necktie | blush | closed_mouth | full_body | pouch | short_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------------|:------------------|:-------------|:--------------|:--------|:--------------------|:-----------|:----------|:-------------------|:--------------|:-------|:----------------|:--------|:---------------|:------------|:--------|:----------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/fg42_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:18:46+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:22:11+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fg42/FG42/FG42 (Girls' Frontline)
============================================
This is the dataset of fg42/FG42/FG42 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are 'blonde\_hair, hat, blue\_eyes, bangs, garrison\_cap, long\_hair, medium\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
5ee5175c4949278c9a061904ffb3f23223e3abf8 |
# Dataset of px4_storm/Px4ストーム/Px4风暴 (Girls' Frontline)
This is the dataset of px4_storm/Px4ストーム/Px4风暴 (Girls' Frontline), containing 31 images and their tags.
The core tags of this character are `green_eyes, blonde_hair, breasts, bangs, large_breasts, mole_under_eye, mole, short_hair, hair_between_eyes, medium_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 31 | 40.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 31 | 22.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 80 | 48.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 31 | 35.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 80 | 68.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/px4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/px4_storm_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, gloves, solo, hood_up, blush, looking_at_viewer, white_background, dress, character_name, handgun, black_coat, holding_gun, skindentation, thigh_strap, thighs |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, blush, looking_at_viewer, navel, solo, white_bikini, cleavage, collarbone, hairclip, halterneck, simple_background, white_background, bare_legs, closed_mouth, feet, full_body, holding, o-ring_bikini, orange_hair, parted_lips, sandals, sarong, sky, smile, standing, stomach, thighs, toes, wet, white_footwear |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, blush, red_sweater, smile, looking_at_viewer, solo, turtleneck, black_pantyhose, beret, earrings, necklace, panties, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | gloves | solo | hood_up | blush | looking_at_viewer | white_background | dress | character_name | handgun | black_coat | holding_gun | skindentation | thigh_strap | thighs | bare_shoulders | navel | white_bikini | cleavage | collarbone | hairclip | halterneck | simple_background | bare_legs | closed_mouth | feet | full_body | holding | o-ring_bikini | orange_hair | parted_lips | sandals | sarong | sky | smile | standing | stomach | toes | wet | white_footwear | red_sweater | turtleneck | black_pantyhose | beret | earrings | necklace | panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:-------|:----------|:--------|:--------------------|:-------------------|:--------|:-----------------|:----------|:-------------|:--------------|:----------------|:--------------|:---------|:-----------------|:--------|:---------------|:-----------|:-------------|:-----------|:-------------|:--------------------|:------------|:---------------|:-------|:------------|:----------|:----------------|:--------------|:--------------|:----------|:---------|:------|:--------|:-----------|:----------|:-------|:------|:-----------------|:--------------|:-------------|:------------------|:--------|:-----------|:-----------|:----------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | X | | X | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | | X | | X | X | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | X | | | | | | X | X | X | X | X | X | X |
| CyberHarem/px4_storm_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T01:19:17+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T01:26:35+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of px4\_storm/Px4ストーム/Px4风暴 (Girls' Frontline)
======================================================
This is the dataset of px4\_storm/Px4ストーム/Px4风暴 (Girls' Frontline), containing 31 images and their tags.
The core tags of this character are 'green\_eyes, blonde\_hair, breasts, bangs, large\_breasts, mole\_under\_eye, mole, short\_hair, hair\_between\_eyes, medium\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
98556e909e16623c2329b4bcd2ad6d3bbf0415a2 |
# Dataset Card for Evaluation run of macadeliccc/polyglot-math-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/polyglot-math-4x7b](https://huggingface.co/macadeliccc/polyglot-math-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b",
"harness_winogrande_5",
split="train")
```
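
The same pattern works for any of the configurations listed in this card. As a minimal sketch (the split name for the aggregated "results" configuration is an assumption here, mirroring the per-task configurations), you could load the aggregated metrics instead of a single task's details:

```python
from datasets import load_dataset

# Load the aggregated results of the latest run (assumes the "results"
# configuration exposes a "latest" split like the per-task configurations do).
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated metrics of the latest run
```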
## Latest results
These are the [latest results from run 2024-01-14T01:25:55.830403](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b/blob/main/results_2024-01-14T01-25-55.830403.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6367747877161951,
"acc_stderr": 0.03232816338890694,
"acc_norm": 0.6393383626953215,
"acc_norm_stderr": 0.03297276004070419,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5378477391082209,
"mc2_stderr": 0.015247687104643274
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.014301752223279542,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955009
},
"harness|hellaswag|10": {
"acc": 0.6549492133041227,
"acc_stderr": 0.004744132825391518,
"acc_norm": 0.8485361481776539,
"acc_norm_stderr": 0.0035776774950640874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.02911661760608301,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.02911661760608301
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612896,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612896
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906944,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906944
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809784,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809784
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508283,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508283
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.01636135476982247,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.01636135476982247
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464492,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464492
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700032,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5378477391082209,
"mc2_stderr": 0.015247687104643274
},
"harness|winogrande|5": {
"acc": 0.7845303867403315,
"acc_stderr": 0.011555295286059282
},
"harness|gsm8k|5": {
"acc": 0.5663381349507203,
"acc_stderr": 0.013650728047064695
}
}
```
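
If you only need this aggregated JSON rather than the per-sample parquet details, a rough sketch is to fetch the file linked above directly (repo id and file name are taken from that link; whether the metrics sit at the top level or under a "results" key can depend on the harness version, so the snippet tolerates both):

```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results file referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b",
    repo_type="dataset",
    filename="results_2024-01-14T01-25-55.830403.json",
)
with open(path) as f:
    data = json.load(f)

metrics = data.get("results", data)  # tolerate either layout
print(metrics["all"])                # aggregated acc / acc_norm / mc1 / mc2
```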
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b | [
"region:us"
] | 2024-01-14T01:28:14+00:00 | {"pretty_name": "Evaluation run of macadeliccc/polyglot-math-4x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/polyglot-math-4x7b](https://huggingface.co/macadeliccc/polyglot-math-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:25:55.830403](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b/blob/main/results_2024-01-14T01-25-55.830403.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6367747877161951,\n \"acc_stderr\": 0.03232816338890694,\n \"acc_norm\": 0.6393383626953215,\n \"acc_norm_stderr\": 0.03297276004070419,\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5378477391082209,\n \"mc2_stderr\": 0.015247687104643274\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.014301752223279542,\n \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955009\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6549492133041227,\n \"acc_stderr\": 0.004744132825391518,\n \"acc_norm\": 0.8485361481776539,\n \"acc_norm_stderr\": 0.0035776774950640874\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n 
\"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.02911661760608301,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.02911661760608301\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612896,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612896\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906944,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906944\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809784,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809784\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8314176245210728,\n \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508283,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508283\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n \"acc_stderr\": 0.01636135476982247,\n \"acc_norm\": 0.39664804469273746,\n \"acc_norm_stderr\": 0.01636135476982247\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464492,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464492\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700032,\n \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700032\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5378477391082209,\n \"mc2_stderr\": 0.015247687104643274\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7845303867403315,\n \"acc_stderr\": 0.011555295286059282\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5663381349507203,\n \"acc_stderr\": 0.013650728047064695\n 
}\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/polyglot-math-4x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_25_55.830403", "path": ["**/details_harness|winogrande|5_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-25-55.830403.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T01_25_55.830403", "path": ["results_2024-01-14T01-25-55.830403.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T01-25-55.830403.parquet"]}]}]} | 2024-01-14T01:28:36+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of macadeliccc/polyglot-math-4x7b
Dataset automatically created during the evaluation run of model macadeliccc/polyglot-math-4x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
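A minimal sketch, assuming the details repo follows the usual `open-llm-leaderboard/details_<org>__<model>` naming; the config and split names are taken from the dataset configs listed above:

```python
from datasets import load_dataset

# Per-sample details for one eval config of this run. The "latest" split
# (declared in the configs above) points at the most recent results; a
# timestamped split is also available for each individual run.
data = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__polyglot-math-4x7b",
    "harness_winogrande_5",
    split="latest",
)
```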
## Latest results
These are the latest results from run 2024-01-14T01:25:55.830403 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of macadeliccc/polyglot-math-4x7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/polyglot-math-4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:25:55.830403(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of macadeliccc/polyglot-math-4x7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/polyglot-math-4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:25:55.830403(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
669cb61386c31208043498a84c1665a5595ef61a |
# Dataset Card for Evaluation run of macadeliccc/laser-polyglot-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/laser-polyglot-4x7b](https://huggingface.co/macadeliccc/laser-polyglot-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b",
"harness_winogrande_5",
split="train")
```
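The aggregated metrics can be loaded the same way through the "results" configuration; a short sketch using the "latest" split declared in this dataset's configs:

```python
from datasets import load_dataset

# Aggregated metrics of the run: the "results" config exposes a "latest"
# split pointing at the most recent results parquet, plus a timestamped
# split per run (e.g. "2024_01_14T01_28_04.517036").
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b",
    "results",
    split="latest",
)
print(results[0])
```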
## Latest results
These are the [latest results from run 2024-01-14T01:28:04.517036](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b/blob/main/results_2024-01-14T01-28-04.517036.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6383969687290681,
"acc_stderr": 0.032222378716622334,
"acc_norm": 0.6424348983154926,
"acc_norm_stderr": 0.03285947296719794,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5546852358397387,
"mc2_stderr": 0.015162772354647294
},
"harness|arc:challenge|25": {
"acc": 0.6092150170648464,
"acc_stderr": 0.01425856388051378,
"acc_norm": 0.6416382252559727,
"acc_norm_stderr": 0.014012883334859857
},
"harness|hellaswag|10": {
"acc": 0.6581358295160327,
"acc_stderr": 0.0047336492748145075,
"acc_norm": 0.8498307110137423,
"acc_norm_stderr": 0.0035650718701954478
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343135,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343135
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.0376017800602662,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.0376017800602662
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8339719029374202,
"acc_stderr": 0.0133064782430663,
"acc_norm": 0.8339719029374202,
"acc_norm_stderr": 0.0133064782430663
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265016,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265016
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.0160837499868537,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.0160837499868537
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824876,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824876
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786855,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786855
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5546852358397387,
"mc2_stderr": 0.015162772354647294
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497811
},
"harness|gsm8k|5": {
"acc": 0.4844579226686884,
"acc_stderr": 0.013765829454512891
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b | [
"region:us"
] | 2024-01-14T01:30:23+00:00 | {"pretty_name": "Evaluation run of macadeliccc/laser-polyglot-4x7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/laser-polyglot-4x7b](https://huggingface.co/macadeliccc/laser-polyglot-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:28:04.517036](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b/blob/main/results_2024-01-14T01-28-04.517036.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6383969687290681,\n \"acc_stderr\": 0.032222378716622334,\n \"acc_norm\": 0.6424348983154926,\n \"acc_norm_stderr\": 0.03285947296719794,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5546852358397387,\n \"mc2_stderr\": 0.015162772354647294\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6092150170648464,\n \"acc_stderr\": 0.01425856388051378,\n \"acc_norm\": 0.6416382252559727,\n \"acc_norm_stderr\": 0.014012883334859857\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6581358295160327,\n \"acc_stderr\": 0.0047336492748145075,\n \"acc_norm\": 0.8498307110137423,\n \"acc_norm_stderr\": 0.0035650718701954478\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343135,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343135\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.0376017800602662,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.0376017800602662\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8339719029374202,\n \"acc_stderr\": 
0.0133064782430663,\n \"acc_norm\": 0.8339719029374202,\n \"acc_norm_stderr\": 0.0133064782430663\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265016,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265016\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n \"acc_stderr\": 0.0160837499868537,\n \"acc_norm\": 0.36312849162011174,\n \"acc_norm_stderr\": 0.0160837499868537\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824876,\n \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824876\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786855,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786855\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5546852358397387,\n \"mc2_stderr\": 0.015162772354647294\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497811\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4844579226686884,\n \"acc_stderr\": 0.013765829454512891\n }\n}\n```", "repo_url": 
"https://huggingface.co/macadeliccc/laser-polyglot-4x7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-28-04.517036.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["**/details_harness|winogrande|5_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-28-04.517036.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T01_28_04.517036", "path": ["results_2024-01-14T01-28-04.517036.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T01-28-04.517036.parquet"]}]}]} | 2024-01-14T01:30:47+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of macadeliccc/laser-polyglot-4x7b
Dataset automatically created during the evaluation run of model macadeliccc/laser-polyglot-4x7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
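As a minimal sketch (assuming the details repository follows the usual `open-llm-leaderboard/details_<org>__<model>` naming convention for this model):

```python
from datasets import load_dataset

# repo id assumed from the leaderboard naming convention; any evaluated task config works here
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-polyglot-4x7b",
	"harness_winogrande_5",
	split="train")
```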
## Latest results
These are the latest results from run 2024-01-14T01:28:04.517036 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of macadeliccc/laser-polyglot-4x7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/laser-polyglot-4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:28:04.517036(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of macadeliccc/laser-polyglot-4x7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/laser-polyglot-4x7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:28:04.517036(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
759795862113fce0d53c5a20825bd5e51d8ce2c3 |
# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-34B-Slerp](https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
"harness_winogrande_5",
split="train")
```
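The aggregated "results" configuration mentioned above can be loaded the same way; a small sketch (config and split names taken from this card's configuration list):

```python
from datasets import load_dataset

# aggregated metrics for the run; the "latest" split always points to the most recent results
results = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
	"results",
	split="latest")
print(results[0])
```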
## Latest results
These are the [latest results from run 2024-01-14T01:56:18.562449](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp/blob/main/results_2024-01-14T01-56-18.562449.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7687638749469244,
"acc_stderr": 0.02791668972955577,
"acc_norm": 0.7731851983230489,
"acc_norm_stderr": 0.028441222412067358,
"mc1": 0.4969400244798042,
"mc1_stderr": 0.01750317326096062,
"mc2": 0.6709148255495884,
"mc2_stderr": 0.014645409374455808
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.7073378839590444,
"acc_norm_stderr": 0.013295916103619422
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337134,
"acc_norm": 0.8568014339772954,
"acc_norm_stderr": 0.0034955936625207526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.881578947368421,
"acc_stderr": 0.02629399585547494,
"acc_norm": 0.881578947368421,
"acc_norm_stderr": 0.02629399585547494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8188679245283019,
"acc_stderr": 0.023702963526757798,
"acc_norm": 0.8188679245283019,
"acc_norm_stderr": 0.023702963526757798
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9166666666666666,
"acc_stderr": 0.023112508176051236,
"acc_norm": 0.9166666666666666,
"acc_norm_stderr": 0.023112508176051236
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5392156862745098,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.5392156862745098,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653695,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349414,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349414
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7724137931034483,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.7724137931034483,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6931216931216931,
"acc_stderr": 0.02375292871211213,
"acc_norm": 0.6931216931216931,
"acc_norm_stderr": 0.02375292871211213
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9096774193548387,
"acc_stderr": 0.016306570644488313,
"acc_norm": 0.9096774193548387,
"acc_norm_stderr": 0.016306570644488313
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6502463054187192,
"acc_stderr": 0.03355400904969566,
"acc_norm": 0.6502463054187192,
"acc_norm_stderr": 0.03355400904969566
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9292929292929293,
"acc_stderr": 0.018263105420199505,
"acc_norm": 0.9292929292929293,
"acc_norm_stderr": 0.018263105420199505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.01889552448260495,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.01889552448260495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.030182099804387262,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.030182099804387262
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8487394957983193,
"acc_stderr": 0.023274255898707946,
"acc_norm": 0.8487394957983193,
"acc_norm_stderr": 0.023274255898707946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5364238410596026,
"acc_stderr": 0.04071636065944217,
"acc_norm": 0.5364238410596026,
"acc_norm_stderr": 0.04071636065944217
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9192660550458716,
"acc_stderr": 0.011680172292862088,
"acc_norm": 0.9192660550458716,
"acc_norm_stderr": 0.011680172292862088
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9113924050632911,
"acc_stderr": 0.018498315206865384,
"acc_norm": 0.9113924050632911,
"acc_norm_stderr": 0.018498315206865384
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280226,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280226
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8854961832061069,
"acc_stderr": 0.027927473753597446,
"acc_norm": 0.8854961832061069,
"acc_norm_stderr": 0.027927473753597446
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8711656441717791,
"acc_stderr": 0.02632138319878367,
"acc_norm": 0.8711656441717791,
"acc_norm_stderr": 0.02632138319878367
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.04572372358737431,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.04572372358737431
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331366
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253876,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253876
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9067688378033205,
"acc_stderr": 0.010397417087292849,
"acc_norm": 0.9067688378033205,
"acc_norm_stderr": 0.010397417087292849
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8179190751445087,
"acc_stderr": 0.02077676110251298,
"acc_norm": 0.8179190751445087,
"acc_norm_stderr": 0.02077676110251298
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.794413407821229,
"acc_stderr": 0.013516116210724202,
"acc_norm": 0.794413407821229,
"acc_norm_stderr": 0.013516116210724202
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8366013071895425,
"acc_stderr": 0.021170623011213505,
"acc_norm": 0.8366013071895425,
"acc_norm_stderr": 0.021170623011213505
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8295819935691319,
"acc_stderr": 0.021355343028264053,
"acc_norm": 0.8295819935691319,
"acc_norm_stderr": 0.021355343028264053
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.648936170212766,
"acc_stderr": 0.028473501272963758,
"acc_norm": 0.648936170212766,
"acc_norm_stderr": 0.028473501272963758
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6029986962190352,
"acc_stderr": 0.012496346982909556,
"acc_norm": 0.6029986962190352,
"acc_norm_stderr": 0.012496346982909556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8419117647058824,
"acc_stderr": 0.022161462608068522,
"acc_norm": 0.8419117647058824,
"acc_norm_stderr": 0.022161462608068522
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8448979591836735,
"acc_stderr": 0.0231747988612186,
"acc_norm": 0.8448979591836735,
"acc_norm_stderr": 0.0231747988612186
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4969400244798042,
"mc1_stderr": 0.01750317326096062,
"mc2": 0.6709148255495884,
"mc2_stderr": 0.014645409374455808
},
"harness|winogrande|5": {
"acc": 0.8437253354380426,
"acc_stderr": 0.010205351791873492
},
"harness|gsm8k|5": {
"acc": 0.6626231993934799,
"acc_stderr": 0.013023665136222096
}
}
```
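If you prefer to work with the raw results file linked above rather than the parquet configurations, here is a minimal sketch using `huggingface_hub` (filename taken from the link above; the exact JSON layout is assumed, so inspect the keys if it differs):

```python
import json
from huggingface_hub import hf_hub_download

# download the raw results file for this run
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
    repo_type="dataset",
    filename="results_2024-01-14T01-56-18.562449.json",
)
with open(path) as f:
    data = json.load(f)

# the aggregated block shown above is expected under a top-level "results" key (assumption)
print(data.get("results", data).get("all"))
```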
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp | [
"region:us"
] | 2024-01-14T01:58:30+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-34B-Slerp](https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:56:18.562449](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp/blob/main/results_2024-01-14T01-56-18.562449.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7687638749469244,\n \"acc_stderr\": 0.02791668972955577,\n \"acc_norm\": 0.7731851983230489,\n \"acc_norm_stderr\": 0.028441222412067358,\n \"mc1\": 0.4969400244798042,\n \"mc1_stderr\": 0.01750317326096062,\n \"mc2\": 0.6709148255495884,\n \"mc2_stderr\": 0.014645409374455808\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n \"acc_norm\": 0.7073378839590444,\n \"acc_norm_stderr\": 0.013295916103619422\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n \"acc_stderr\": 0.004714386376337134,\n \"acc_norm\": 0.8568014339772954,\n \"acc_norm_stderr\": 0.0034955936625207526\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.881578947368421,\n \"acc_stderr\": 0.02629399585547494,\n \"acc_norm\": 0.881578947368421,\n \"acc_norm_stderr\": 0.02629399585547494\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8188679245283019,\n \"acc_stderr\": 0.023702963526757798,\n \"acc_norm\": 0.8188679245283019,\n \"acc_norm_stderr\": 0.023702963526757798\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9166666666666666,\n \"acc_stderr\": 0.023112508176051236,\n \"acc_norm\": 0.9166666666666666,\n \"acc_norm_stderr\": 0.023112508176051236\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n 
\"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5392156862745098,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.5392156862745098,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653695,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653695\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349414,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349414\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7724137931034483,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.7724137931034483,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6931216931216931,\n \"acc_stderr\": 0.02375292871211213,\n \"acc_norm\": 0.6931216931216931,\n \"acc_norm_stderr\": 0.02375292871211213\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9096774193548387,\n \"acc_stderr\": 0.016306570644488313,\n \"acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.016306570644488313\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6502463054187192,\n \"acc_stderr\": 0.03355400904969566,\n \"acc_norm\": 0.6502463054187192,\n \"acc_norm_stderr\": 0.03355400904969566\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9292929292929293,\n \"acc_stderr\": 0.018263105420199505,\n \"acc_norm\": 0.9292929292929293,\n \"acc_norm_stderr\": 0.018263105420199505\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.01889552448260495,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.01889552448260495\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.42962962962962964,\n \"acc_stderr\": 0.030182099804387262,\n \"acc_norm\": 0.42962962962962964,\n \"acc_norm_stderr\": 0.030182099804387262\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8487394957983193,\n \"acc_stderr\": 0.023274255898707946,\n \"acc_norm\": 0.8487394957983193,\n \"acc_norm_stderr\": 0.023274255898707946\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5364238410596026,\n \"acc_stderr\": 0.04071636065944217,\n \"acc_norm\": 0.5364238410596026,\n \"acc_norm_stderr\": 0.04071636065944217\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9192660550458716,\n \"acc_stderr\": 0.011680172292862088,\n \"acc_norm\": 0.9192660550458716,\n \"acc_norm_stderr\": 0.011680172292862088\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9113924050632911,\n \"acc_stderr\": 0.018498315206865384,\n \"acc_norm\": 0.9113924050632911,\n \"acc_norm_stderr\": 0.018498315206865384\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280226,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280226\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8854961832061069,\n \"acc_stderr\": 0.027927473753597446,\n \"acc_norm\": 0.8854961832061069,\n \"acc_norm_stderr\": 0.027927473753597446\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8711656441717791,\n \"acc_stderr\": 0.02632138319878367,\n \"acc_norm\": 0.8711656441717791,\n \"acc_norm_stderr\": 0.02632138319878367\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n \"acc_stderr\": 0.04572372358737431,\n \"acc_norm\": 0.6339285714285714,\n \"acc_norm_stderr\": 0.04572372358737431\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331366,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331366\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n \"acc_stderr\": 0.015537514263253876,\n \"acc_norm\": 0.9401709401709402,\n \"acc_norm_stderr\": 0.015537514263253876\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9067688378033205,\n \"acc_stderr\": 0.010397417087292849,\n \"acc_norm\": 0.9067688378033205,\n \"acc_norm_stderr\": 0.010397417087292849\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8179190751445087,\n \"acc_stderr\": 0.02077676110251298,\n \"acc_norm\": 0.8179190751445087,\n \"acc_norm_stderr\": 0.02077676110251298\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.794413407821229,\n \"acc_stderr\": 0.013516116210724202,\n \"acc_norm\": 0.794413407821229,\n \"acc_norm_stderr\": 0.013516116210724202\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8366013071895425,\n \"acc_stderr\": 0.021170623011213505,\n \"acc_norm\": 0.8366013071895425,\n \"acc_norm_stderr\": 0.021170623011213505\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8295819935691319,\n \"acc_stderr\": 0.021355343028264053,\n \"acc_norm\": 0.8295819935691319,\n \"acc_norm_stderr\": 0.021355343028264053\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.648936170212766,\n \"acc_stderr\": 0.028473501272963758,\n \"acc_norm\": 0.648936170212766,\n \"acc_norm_stderr\": 0.028473501272963758\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6029986962190352,\n \"acc_stderr\": 0.012496346982909556,\n \"acc_norm\": 0.6029986962190352,\n \"acc_norm_stderr\": 0.012496346982909556\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8419117647058824,\n \"acc_stderr\": 0.022161462608068522,\n \"acc_norm\": 0.8419117647058824,\n \"acc_norm_stderr\": 0.022161462608068522\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8448979591836735,\n \"acc_stderr\": 0.0231747988612186,\n \"acc_norm\": 0.8448979591836735,\n \"acc_norm_stderr\": 0.0231747988612186\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4969400244798042,\n \"mc1_stderr\": 0.01750317326096062,\n \"mc2\": 0.6709148255495884,\n \"mc2_stderr\": 0.014645409374455808\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8437253354380426,\n \"acc_stderr\": 0.010205351791873492\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6626231993934799,\n \"acc_stderr\": 0.013023665136222096\n 
}\n}\n```", "repo_url": "https://huggingface.co/Weyaxi/Bagel-Hermes-34B-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_56_18.562449", "path": ["**/details_harness|winogrande|5_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-56-18.562449.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T01_56_18.562449", "path": ["results_2024-01-14T01-56-18.562449.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T01-56-18.562449.parquet"]}]}]} | 2024-01-14T01:58:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp
Dataset automatically created during the evaluation run of model Weyaxi/Bagel-Hermes-34B-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
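A minimal sketch of that load call, following the naming convention used by the leaderboard detail datasets (the repository id below is assumed from the "open-llm-leaderboard/details_<org>__<model>" pattern rather than stated in this card; "harness_winogrande_5" is one of the configs listed in the metadata):

```python
from datasets import load_dataset

# Assumed repository id for this evaluation run, following the
# open-llm-leaderboard/details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-34B-Slerp",
    "harness_winogrande_5",  # any of the 63 listed configs can be used here
    split="train",           # "train" always points to the latest results
)
print(data)
```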
## Latest results
These are the latest results from run 2024-01-14T01:56:18.562449 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Bagel-Hermes-34B-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:56:18.562449(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-34B-Slerp\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Bagel-Hermes-34B-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:56:18.562449(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0ed116fa8a6676b7d581035fe3928d625442ae6e |
# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-4x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo",
"harness_winogrande_5",
split="train")
```
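Once loaded, the split behaves like any other `datasets.Dataset`, so a quick inspection sketch (using only the standard `datasets` API, with no assumptions about the per-task schema) looks like this:

```python
# Peek at the per-example details returned by the call above.
print(len(data), "rows")     # number of evaluated examples in this config
print(data.column_names)     # the schema varies per task, so inspect it rather than assume it
print(data[0])               # first logged example with its prompt, prediction and metrics
```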
## Latest results
These are the [latest results from run 2024-01-14T01:56:15.562894](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo/blob/main/results_2024-01-14T01-56-15.562894.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6304823287754658,
"acc_stderr": 0.03239962883986832,
"acc_norm": 0.6345924216801483,
"acc_norm_stderr": 0.033044077680253386,
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6377296280073737,
"mc2_stderr": 0.015266761289957081
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.6493174061433447,
"acc_norm_stderr": 0.013944635930726096
},
"harness|hellaswag|10": {
"acc": 0.6742680740888269,
"acc_stderr": 0.004676898861978916,
"acc_norm": 0.8580959968133838,
"acc_norm_stderr": 0.003482384956632782
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431353,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431353
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579823,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579823
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3318435754189944,
"acc_stderr": 0.015748421208187306,
"acc_norm": 0.3318435754189944,
"acc_norm_stderr": 0.015748421208187306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.01939305840235544,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.01939305840235544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768914,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768914
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368032,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368032
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4589963280293758,
"mc1_stderr": 0.017444544447661192,
"mc2": 0.6377296280073737,
"mc2_stderr": 0.015266761289957081
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
},
"harness|gsm8k|5": {
"acc": 0.4488248673237301,
"acc_stderr": 0.01370015744278808
}
}
```
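The aggregated numbers above are also stored in the "results" configuration, whose "latest" split points to the newest run. A small sketch of loading them programmatically (the exact column layout of the results parquet is not documented in this card, so it is inspected rather than assumed):

```python
from datasets import load_dataset

# Aggregated metrics for the newest run of this evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo",
    "results",
    split="latest",
)
print(results.column_names)  # discover the stored fields
print(results[0])            # row holding the aggregated scores shown above
```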
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo | [
"region:us"
] | 2024-01-14T01:58:34+00:00 | {"pretty_name": "Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/laser-dolphin-mixtral-4x7b-dpo](https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T01:56:15.562894](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo/blob/main/results_2024-01-14T01-56-15.562894.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6304823287754658,\n \"acc_stderr\": 0.03239962883986832,\n \"acc_norm\": 0.6345924216801483,\n \"acc_norm_stderr\": 0.033044077680253386,\n \"mc1\": 0.4589963280293758,\n \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6377296280073737,\n \"mc2_stderr\": 0.015266761289957081\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111728,\n \"acc_norm\": 0.6493174061433447,\n \"acc_norm_stderr\": 0.013944635930726096\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6742680740888269,\n \"acc_stderr\": 0.004676898861978916,\n \"acc_norm\": 0.8580959968133838,\n \"acc_norm_stderr\": 0.003482384956632782\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n 
\"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431353,\n \"acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431353\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8199233716475096,\n \"acc_stderr\": 0.013740797258579823,\n \"acc_norm\": 0.8199233716475096,\n \"acc_norm_stderr\": 0.013740797258579823\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3318435754189944,\n \"acc_stderr\": 0.015748421208187306,\n \"acc_norm\": 0.3318435754189944,\n \"acc_norm_stderr\": 0.015748421208187306\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.029227192460032025,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.029227192460032025\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.027962677604768914,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.027962677604768914\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368032,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368032\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4589963280293758,\n \"mc1_stderr\": 0.017444544447661192,\n \"mc2\": 0.6377296280073737,\n \"mc2_stderr\": 0.015266761289957081\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4488248673237301,\n \"acc_stderr\": 0.01370015744278808\n 
}\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/laser-dolphin-mixtral-4x7b-dpo", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T01_56_15.562894", "path": ["**/details_harness|winogrande|5_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T01-56-15.562894.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T01_56_15.562894", "path": ["results_2024-01-14T01-56-15.562894.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T01-56-15.562894.parquet"]}]}]} | 2024-01-14T01:58:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo
Dataset automatically created during the evaluation run of model macadeliccc/laser-dolphin-mixtral-4x7b-dpo on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
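```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo",
    "harness_winogrande_5",
    split="train",
)
```

Every per-task configuration (for example `harness_arc_challenge_25` or `harness_gsm8k_5`) can be loaded the same way. For the aggregated metrics, a minimal sketch assuming the `results` configuration and its `latest` split declared in the card metadata:

```python
from datasets import load_dataset

# aggregated metrics of the most recent run (see the "results" configuration)
results = load_dataset(
    "open-llm-leaderboard/details_macadeliccc__laser-dolphin-mixtral-4x7b-dpo",
    "results",
    split="latest",
)
```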
## Latest results
These are the latest results from run 2024-01-14T01:56:15.562894 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
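A condensed view of the aggregated metrics recorded for this run (the full per-task numbers for all 63 configurations are kept in the card metadata and in the per-task configurations; only a few headline tasks are reproduced here):

```python
{
    "all": {
        "acc": 0.6304823287754658,
        "acc_norm": 0.6345924216801483,
        "mc1": 0.4589963280293758,
        "mc2": 0.6377296280073737
    },
    "harness|arc:challenge|25": {"acc_norm": 0.6493174061433447},
    "harness|hellaswag|10": {"acc_norm": 0.8580959968133838},
    "harness|winogrande|5": {"acc": 0.7782162588792423},
    "harness|gsm8k|5": {"acc": 0.4488248673237301}
}
```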
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/laser-dolphin-mixtral-4x7b-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:56:15.562894(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of macadeliccc/laser-dolphin-mixtral-4x7b-dpo\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/laser-dolphin-mixtral-4x7b-dpo on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T01:56:15.562894(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
95e0259118327b4bda5d80b334f3b1cd8e52e3a2 |
# Dataset of clemenceau/クレマンソー/克莱蒙梭 (Azur Lane)
This is the dataset of clemenceau/クレマンソー/克莱蒙梭 (Azur Lane), containing 43 images and their tags.
The core tags of this character are `long_hair, breasts, large_breasts, pink_hair, red_eyes, crown, bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 86.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 43 | 40.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 108 | 87.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 43 | 70.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 108 | 138.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/clemenceau_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
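The pre-processed packages (the `800`, `1200`, and `stage3-*` variants) are plain IMG+TXT archives, so they can be fetched and extracted without waifuc; a minimal sketch, assuming the same `hf_hub_download` flow used below for the raw package:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/clemenceau_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the image / .txt caption pairs to a local directory
output_dir = 'clemenceau_800'
os.makedirs(output_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(output_dir)
```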
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/clemenceau_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, white_gloves, black_skirt, looking_at_viewer, ponytail, visor_cap, cleavage, bare_shoulders, outdoors, thighs, brown_hair, holding, miniskirt, pencil_skirt, clothing_cutout, crop_top, earrings, sky, belt, cloud, golf_club, sleeveless_shirt |
| 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, solo, black_gloves, black_dress, elbow_gloves, cleavage, fur_trim, jewelry, holding, smile, cape, simple_background, cross |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, black_gloves, looking_at_viewer, solo, black_dress, cape, fur_trim, hair_between_eyes, long_dress, pink_eyes, standing, braid, cleavage, closed_mouth, holding_staff, signature, simple_background, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_gloves | black_skirt | looking_at_viewer | ponytail | visor_cap | cleavage | bare_shoulders | outdoors | thighs | brown_hair | holding | miniskirt | pencil_skirt | clothing_cutout | crop_top | earrings | sky | belt | cloud | golf_club | sleeveless_shirt | black_gloves | black_dress | elbow_gloves | fur_trim | jewelry | smile | cape | simple_background | cross | hair_between_eyes | long_dress | pink_eyes | standing | braid | closed_mouth | holding_staff | signature | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------|:--------------------|:-----------|:------------|:-----------|:-----------------|:-----------|:---------|:-------------|:----------|:------------|:---------------|:------------------|:-----------|:-----------|:------|:-------|:--------|:------------|:-------------------|:---------------|:--------------|:---------------|:-----------|:----------|:--------|:-------|:--------------------|:--------|:--------------------|:-------------|:------------|:-----------|:--------|:---------------|:----------------|:------------|:-------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 10 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | | X | | | X | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 2 | 5 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | | | X | | | X | | | | | | | | | | | | | | | | X | X | | X | | X | X | X | | X | X | X | X | X | X | X | X | X |
| CyberHarem/clemenceau_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:09:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:20:44+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of clemenceau/クレマンソー/克莱蒙梭 (Azur Lane)
=============================================
This is the dataset of clemenceau/クレマンソー/克莱蒙梭 (Azur Lane), containing 43 images and their tags.
The core tags of this character are 'long\_hair, breasts, large\_breasts, pink\_hair, red\_eyes, crown, bangs, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
686d368c78224b85fe3d8c3e197d1ca4e2a38a36 |
# Dataset of yorck/ヨルク/约克DE (Azur Lane)
This is the dataset of yorck/ヨルク/约克DE (Azur Lane), containing 51 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, white_hair, bangs, red_eyes, hat, black_headwear, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 51 | 97.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 51 | 45.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 135 | 104.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 51 | 80.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 135 | 160.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yorck_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yorck_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, cleavage, official_alternate_costume, black_dress, black_thighhighs, choker, thighs, horns, very_long_hair, bare_shoulders, sitting, smile, brown_thighhighs, blush, thigh_strap, closed_mouth, evening_gown |
| 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, solo, cleavage, black_gloves, bare_shoulders, black_dress, smile, blush, fishnets, iron_cross, earrings, military_hat, white_thighhighs, closed_mouth, simple_background, white_background, peaked_cap |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | cleavage | official_alternate_costume | black_dress | black_thighhighs | choker | thighs | horns | very_long_hair | bare_shoulders | sitting | smile | brown_thighhighs | blush | thigh_strap | closed_mouth | evening_gown | black_gloves | fishnets | iron_cross | earrings | military_hat | white_thighhighs | simple_background | white_background | peaked_cap |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:-----------------------------|:--------------|:-------------------|:---------|:---------|:--------|:-----------------|:-----------------|:----------|:--------|:-------------------|:--------|:--------------|:---------------|:---------------|:---------------|:-----------|:-------------|:-----------|:---------------|:-------------------|:--------------------|:-------------------|:-------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 19 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | | X | | | | | | X | | X | | X | | X | | X | X | X | X | X | X | X | X | X |
| CyberHarem/yorck_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:09:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:29:18+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of yorck/ヨルク/约克DE (Azur Lane)
=====================================
This is the dataset of yorck/ヨルク/约克DE (Azur Lane), containing 51 images and their tags.
The core tags of this character are 'breasts, long\_hair, large\_breasts, white\_hair, bangs, red\_eyes, hat, black\_headwear, hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
0433d919af9554d231611ee3ea794a86fa4085d4 |
# Dataset of katsuragi/葛城/葛城 (Azur Lane)
This is the dataset of katsuragi/葛城/葛城 (Azur Lane), containing 41 images and their tags.
The core tags of this character are `breasts, long_hair, hair_ornament, twintails, small_breasts, blue_eyes, black_hair, earrings, bangs, green_eyes`; these tags are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 41 | 58.87 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 41 | 33.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 90 | 65.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 41 | 51.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 90 | 91.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/katsuragi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/katsuragi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
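The snippet above covers the `Waifuc-Raw` package only. As a minimal sketch for the prepackaged `IMG+TXT` variants, the 800px package can be fetched and read as below; this assumes (as the `IMG+TXT` type suggests) that each image in the archive sits next to a same-named `.txt` file holding its tags, and the `dataset_800` directory name is only an example.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/katsuragi_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: image.png + image.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```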
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 29 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, jewelry, open_mouth, detached_sleeves, simple_background, white_thighhighs, blush, hairband, smile, white_background, blue_hair |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, fake_animal_ears, playboy_bunny, rabbit_ears, solo, looking_at_viewer, official_alternate_costume, rabbit_tail, red_leotard, black_gloves, black_pantyhose, hair_flower, covered_navel, open_mouth, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | jewelry | open_mouth | detached_sleeves | simple_background | white_thighhighs | blush | hairband | smile | white_background | blue_hair | fake_animal_ears | playboy_bunny | rabbit_ears | official_alternate_costume | rabbit_tail | red_leotard | black_gloves | black_pantyhose | hair_flower | covered_navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:----------|:-------------|:-------------------|:--------------------|:-------------------|:--------|:-----------|:--------|:-------------------|:------------|:-------------------|:----------------|:--------------|:-----------------------------|:--------------|:--------------|:---------------|:------------------|:--------------|:----------------|
| 0 | 29 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 1 | 7 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | X | | X | | | | | X | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/katsuragi_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:09:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:24:39+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of katsuragi/葛城/葛城 (Azur Lane)
======================================
This is the dataset of katsuragi/葛城/葛城 (Azur Lane), containing 41 images and their tags.
The core tags of this character are 'breasts, long\_hair, hair\_ornament, twintails, small\_breasts, blue\_eyes, black\_hair, earrings, bangs, green\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b9363405f2840f3b0bf65d10552b122d07791c8e |
# Dataset of yumi/雪泉/雪泉 (Azur Lane)
This is the dataset of yumi/雪泉/雪泉 (Azur Lane), containing 500 images and their tags.
The core tags of this character are `breasts, blue_eyes, short_hair, bow, grey_hair, hair_bow, large_breasts, white_bow, medium_hair`; these tags are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 680.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 374.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1222 | 781.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 593.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1222 | 1.12 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yumi_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yumi_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
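The snippet above covers the `Waifuc-Raw` package only. As a minimal sketch for the prepackaged `IMG+TXT` variants, the 800px package can be fetched and read as below; this assumes (as the `IMG+TXT` type suggests) that each image in the archive sits next to a same-named `.txt` file holding its tags, and the `dataset_800` directory name is only an example.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/yumi_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: image.png + image.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```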
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, cleavage, looking_at_viewer, solo, white_background, collarbone, simple_background, blush, bare_shoulders, navel, smile, bangs, huge_breasts, blue_bikini |
| 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, off_shoulder, simple_background, solo, white_background, white_kimono, bangs, low_neckline, blush, open_mouth, shiny_skin, huge_breasts, shiny_hair |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, bare_shoulders, cleavage, collarbone, looking_at_viewer, off_shoulder, parted_bangs, solo, white_kimono, low_neckline, upper_body, blush, closed_mouth, smile, wide_sleeves, snowflakes |
| 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | 1girl, bare_shoulders, cleavage, kimono, looking_at_viewer, off_shoulder, solo, low_neckline, collarbone, folding_fan, huge_breasts, smile |
| 4 | 23 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | day, looking_at_viewer, cleavage, 1girl, outdoors, navel, smile, solo, blush, beach, ocean, blue_sky, cloud, blue_bikini, open_mouth, water, bare_shoulders, collarbone, side-tie_bikini_bottom |
| 5 | 12 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | 1boy, 1girl, blush, collarbone, hetero, solo_focus, nipples, paizuri, huge_breasts, penis, breasts_squeezed_together, open_mouth, bare_shoulders, looking_at_viewer, nude, smile, sweat, bangs, mosaic_censoring |
| 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | 1girl, cat_ears, looking_at_viewer, solo, navel, open_mouth, smile, blush, cat_tail, cleavage, simple_background, bare_shoulders, bell, white_background, cat_paws, gloves, white_panties |
| 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | 1girl, blush, looking_at_viewer, solo, open_mouth, simple_background, white_background, white_shirt, black_pantyhose, smile, black_hair, black_skirt, cleavage, long_sleeves, pencil_skirt |
| 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | 1girl, blush, hetero, huge_breasts, penis, pussy, sex, shiny_hair, spread_legs, vaginal, 1boy, shiny_skin, solo_focus, bar_censor, navel, nipples, nude, open_mouth, sweat, collarbone, kimono |
| 9 | 12 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | playboy_bunny, rabbit_ears, 1girl, fake_animal_ears, solo, detached_collar, looking_at_viewer, rabbit_tail, strapless_leotard, bare_shoulders, pantyhose, blush, cleavage, fishnets, white_background, white_leotard, simple_background, wrist_cuffs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | cleavage | looking_at_viewer | solo | white_background | collarbone | simple_background | blush | bare_shoulders | navel | smile | bangs | huge_breasts | blue_bikini | off_shoulder | white_kimono | low_neckline | open_mouth | shiny_skin | shiny_hair | parted_bangs | upper_body | closed_mouth | wide_sleeves | snowflakes | kimono | folding_fan | day | outdoors | beach | ocean | blue_sky | cloud | water | side-tie_bikini_bottom | 1boy | hetero | solo_focus | nipples | paizuri | penis | breasts_squeezed_together | nude | sweat | mosaic_censoring | cat_ears | cat_tail | bell | cat_paws | gloves | white_panties | white_shirt | black_pantyhose | black_hair | black_skirt | long_sleeves | pencil_skirt | pussy | sex | spread_legs | vaginal | bar_censor | playboy_bunny | rabbit_ears | fake_animal_ears | detached_collar | rabbit_tail | strapless_leotard | pantyhose | fishnets | white_leotard | wrist_cuffs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------------------|:-------|:-------------------|:-------------|:--------------------|:--------|:-----------------|:--------|:--------|:--------|:---------------|:--------------|:---------------|:---------------|:---------------|:-------------|:-------------|:-------------|:---------------|:-------------|:---------------|:---------------|:-------------|:---------|:--------------|:------|:-----------|:--------|:--------|:-----------|:--------|:--------|:-------------------------|:-------|:---------|:-------------|:----------|:----------|:--------|:----------------------------|:-------|:--------|:-------------------|:-----------|:-----------|:-------|:-----------|:---------|:----------------|:--------------|:------------------|:-------------|:--------------|:---------------|:---------------|:--------|:------|:--------------|:----------|:-------------|:----------------|:--------------|:-------------------|:------------------|:--------------|:--------------------|:------------|:-----------|:----------------|:--------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | X | X | X | X | X | X | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | X | | X | | X | X | | X | | | | X | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 | ![](samples/3/clu3-sample0.png) | ![](samples/3/clu3-sample1.png) | ![](samples/3/clu3-sample2.png) | ![](samples/3/clu3-sample3.png) | ![](samples/3/clu3-sample4.png) | X | X | X | X | | X | | | X | | X | | X | | X | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 23 | ![](samples/4/clu4-sample0.png) | ![](samples/4/clu4-sample1.png) | ![](samples/4/clu4-sample2.png) | ![](samples/4/clu4-sample3.png) | ![](samples/4/clu4-sample4.png) | X | X | X | X | | X | | X | X | X | X | | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 12 | ![](samples/5/clu5-sample0.png) | ![](samples/5/clu5-sample1.png) | ![](samples/5/clu5-sample2.png) | ![](samples/5/clu5-sample3.png) | ![](samples/5/clu5-sample4.png) | X | | X | | | X | | X | X | | X | X | X | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 7 | ![](samples/6/clu6-sample0.png) | ![](samples/6/clu6-sample1.png) | ![](samples/6/clu6-sample2.png) | ![](samples/6/clu6-sample3.png) | ![](samples/6/clu6-sample4.png) | X | X | X | X | X | | X | X | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 | ![](samples/7/clu7-sample0.png) | ![](samples/7/clu7-sample1.png) | ![](samples/7/clu7-sample2.png) | ![](samples/7/clu7-sample3.png) | ![](samples/7/clu7-sample4.png) | X | X | X | X | X | | X | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 8 | 8 | ![](samples/8/clu8-sample0.png) | ![](samples/8/clu8-sample1.png) | ![](samples/8/clu8-sample2.png) | ![](samples/8/clu8-sample3.png) | ![](samples/8/clu8-sample4.png) | X | | | | | X | | X | | X | | | X | | | | | X | X | X | | | | | | X | | | | | | | | | | X | X | X | X | | X | | X | X | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | |
| 9 | 12 | ![](samples/9/clu9-sample0.png) | ![](samples/9/clu9-sample1.png) | ![](samples/9/clu9-sample2.png) | ![](samples/9/clu9-sample3.png) | ![](samples/9/clu9-sample4.png) | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/yumi_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:09:51+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T04:24:16+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of yumi/雪泉/雪泉 (Azur Lane)
=================================
This is the dataset of yumi/雪泉/雪泉 (Azur Lane), containing 500 images and their tags.
The core tags of this character are 'breasts, blue\_eyes, short\_hair, bow, grey\_hair, hair\_bow, large\_breasts, white\_bow, medium\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
06da3818ad180c58eaa3dc0ef6966c8a8a88e3e9 |
# Dataset of k31/K31/K31 (Girls' Frontline)
This is the dataset of k31/K31/K31 (Girls' Frontline), containing 18 images and their tags.
The core tags of this character are `hair_ornament, pink_hair, long_hair, purple_eyes, headphones, breasts, bangs, hair_between_eyes, hair_intakes, x_hair_ornament`; these tags are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 18 | 21.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 18 | 10.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 45 | 23.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 18 | 18.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 45 | 37.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/k31_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/k31_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
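The snippet above covers the `Waifuc-Raw` package only. As a minimal sketch for the prepackaged `IMG+TXT` variants, the 800px package can be fetched and read as below; this assumes (as the `IMG+TXT` type suggests) that each image in the archive sits next to a same-named `.txt` file holding its tags, and the `dataset_800` directory name is only an example.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/k31_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: image.png + image.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```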
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, cleavage, holding, smile, looking_at_viewer, simple_background, white_background, blush, black_jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cleavage | holding | smile | looking_at_viewer | simple_background | white_background | blush | black_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:----------|:--------|:--------------------|:--------------------|:-------------------|:--------|:---------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/k31_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:01+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:37+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of k31/K31/K31 (Girls' Frontline)
=========================================
This is the dataset of k31/K31/K31 (Girls' Frontline), containing 18 images and their tags.
The core tags of this character are 'hair\_ornament, pink\_hair, long\_hair, purple\_eyes, headphones, breasts, bangs, hair\_between\_eyes, hair\_intakes, x\_hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
9df828bf4e0a920bafc18f0deced4f7a373140b6 |
# Dataset of pp_19/PP-19/PP-19 (Girls' Frontline)
This is the dataset of pp_19/PP-19/PP-19 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `blue_eyes, short_hair, white_hair, bangs, breasts, medium_breasts, blunt_bangs`; these tags are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 13.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 8.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 33 | 19.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 12.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 33 | 23.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_19_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pp_19_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
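The snippet above covers the `Waifuc-Raw` package only. As a minimal sketch for the prepackaged `IMG+TXT` variants, the 800px package can be fetched and read as below; this assumes (as the `IMG+TXT` type suggests) that each image in the archive sits next to a same-named `.txt` file holding its tags, and the `dataset_800` directory name is only an example.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/pp_19_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: image.png + image.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```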
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, gloves, fur_trim, gun, boots, holding_weapon, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | gloves | fur_trim | gun | boots | holding_weapon | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:---------|:-----------|:------|:--------|:-----------------|:-------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X |
| CyberHarem/pp_19_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:14+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of pp\_19/PP-19/PP-19 (Girls' Frontline)
================================================
This is the dataset of pp\_19/PP-19/PP-19 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are 'blue\_eyes, short\_hair, white\_hair, bangs, breasts, medium\_breasts, blunt\_bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
991bf5e0f55f3955235c1f4e313d5b42d35ea967 |
# Dataset of pp_90/PP-90/PP-90 (Girls' Frontline)
This is the dataset of pp_90/PP-90/PP-90 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `twintails, drill_hair, grey_hair, red_eyes, twin_drills, bangs, hair_ornament, long_hair, ahoge, headphones, x_hair_ornament`; these tags are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 20.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 13.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 41 | 25.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 19.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 41 | 34.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pp_90_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pp_90_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
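The snippet above covers the `Waifuc-Raw` package only. As a minimal sketch for the prepackaged `IMG+TXT` variants, the 800px package can be fetched and read as below; this assumes (as the `IMG+TXT` type suggests) that each image in the archive sits next to a same-named `.txt` file holding its tags, and the `dataset_800` directory name is only an example.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/pp_90_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: image.png + image.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```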
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, open_mouth, smile, white_shirt, black_gloves, looking_at_viewer, black_jacket, green_necktie, holding, long_sleeves, open_jacket, shorts, simple_background, blush, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | open_mouth | smile | white_shirt | black_gloves | looking_at_viewer | black_jacket | green_necktie | holding | long_sleeves | open_jacket | shorts | simple_background | blush | navel |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:--------|:--------------|:---------------|:--------------------|:---------------|:----------------|:----------|:---------------|:--------------|:---------|:--------------------|:--------|:--------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/pp_90_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:05+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:30:50+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of pp\_90/PP-90/PP-90 (Girls' Frontline)
================================================
This is the dataset of pp\_90/PP-90/PP-90 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are 'twintails, drill\_hair, grey\_hair, red\_eyes, twin\_drills, bangs, hair\_ornament, long\_hair, ahoge, headphones, x\_hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
322de340a1c21971b9a26609da8b537475e9ff1f |
# Dataset of qbz_191/QBZ-191/QBZ-191 (Girls' Frontline)
This is the dataset of qbz_191/QBZ-191/QBZ-191 (Girls' Frontline), containing 22 images and their tags.
The core tags of this character are `long_hair, bangs, black_hair, breasts, orange_eyes, medium_breasts, hair_ornament`; these tags are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 22 | 30.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 22 | 17.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 58 | 36.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 22 | 27.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 58 | 48.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/qbz_191_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/qbz_191_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
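The snippet above covers the `Waifuc-Raw` package only. As a minimal sketch for the prepackaged `IMG+TXT` variants, the 800px package can be fetched and read as below; this assumes (as the `IMG+TXT` type suggests) that each image in the archive sits next to a same-named `.txt` file holding its tags, and the `dataset_800` directory name is only an example.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/qbz_191_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: image.png + image.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```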
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, gloves, looking_at_viewer, white_dress, standing, black_thighhighs, holding_gun, assault_rifle, closed_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | gloves | looking_at_viewer | white_dress | standing | black_thighhighs | holding_gun | assault_rifle | closed_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------|:--------------------|:--------------|:-----------|:-------------------|:--------------|:----------------|:---------------|
| 0 | 22 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/qbz_191_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:33:12+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of qbz\_191/QBZ-191/QBZ-191 (Girls' Frontline)
======================================================
This is the dataset of qbz\_191/QBZ-191/QBZ-191 (Girls' Frontline), containing 22 images and their tags.
The core tags of this character are 'long\_hair, bangs, black\_hair, breasts, orange\_eyes, medium\_breasts, hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6aec2e875db362b516ff497f6f61beed32dee37a |
# Dataset of p08/P08/P08 (Girls' Frontline)
This is the dataset of p08/P08/P08 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `short_hair, breasts, hat, brown_eyes, garrison_cap, medium_breasts, white_hair, bangs`; these tags are pruned from the per-image tag lists in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 20.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 12.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 43 | 24.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 18.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 43 | 34.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p08_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/p08_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
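The snippet above covers the `Waifuc-Raw` package only. As a minimal sketch for the prepackaged `IMG+TXT` variants, the 800px package can be fetched and read as below; this assumes (as the `IMG+TXT` type suggests) that each image in the archive sits next to a same-named `.txt` file holding its tags, and the `dataset_800` directory name is only an example.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/p08_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: image.png + image.txt)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in ('.png', '.jpg', '.jpeg', '.webp'):
        continue
    tag_path = os.path.join(dataset_dir, stem + '.txt')
    if os.path.exists(tag_path):
        with open(tag_path, 'r', encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```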
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, cleavage, long_sleeves, white_gloves, blue_jacket, boots, cropped_jacket, smile, thigh_strap, white_background, belt, black_leotard, blush, handgun, military_uniform, open_clothes, simple_background, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | cleavage | long_sleeves | white_gloves | blue_jacket | boots | cropped_jacket | smile | thigh_strap | white_background | belt | black_leotard | blush | handgun | military_uniform | open_clothes | simple_background | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:-----------|:---------------|:---------------|:--------------|:--------|:-----------------|:--------|:--------------|:-------------------|:-------|:----------------|:--------|:----------|:-------------------|:---------------|:--------------------|:-----------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/p08_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:11+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:47+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of p08/P08/P08 (Girls' Frontline)
=========================================
This is the dataset of p08/P08/P08 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are 'short\_hair, breasts, hat, brown\_eyes, garrison\_cap, medium\_breasts, white\_hair, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
f1dcc898be2617bcf68d9f5690de03e9d5041db2 |
# Dataset of t_5000/T-5000/T-5000 (Girls' Frontline)
This is the dataset of t_5000/T-5000/T-5000 (Girls' Frontline), containing 17 images and their tags.
The core tags of this character are `long_hair, red_hair, blue_eyes, breasts, hair_between_eyes, very_long_hair, bangs, medium_breasts, large_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 17.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 11.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 31 | 19.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 16.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 31 | 26.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/t_5000_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/t_5000_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, crop_top, looking_at_viewer, midriff, navel, fingerless_gloves, short_shorts, black_gloves, blush, thigh_strap, black_shirt, full_body, pouch, rifle, simple_background, socks, white_jacket, white_shorts, belt, bright_pupils, eyes_visible_through_hair, holding, single_thighhigh, standing, sweatdrop, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | crop_top | looking_at_viewer | midriff | navel | fingerless_gloves | short_shorts | black_gloves | blush | thigh_strap | black_shirt | full_body | pouch | rifle | simple_background | socks | white_jacket | white_shorts | belt | bright_pupils | eyes_visible_through_hair | holding | single_thighhigh | standing | sweatdrop | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:--------------------|:----------|:--------|:--------------------|:---------------|:---------------|:--------|:--------------|:--------------|:------------|:--------|:--------|:--------------------|:--------|:---------------|:---------------|:-------|:----------------|:----------------------------|:----------|:-------------------|:-----------|:------------|:-------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
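As a rough way to mine the cluster above yourself, the sketch below filters an extracted IMG+TXT package (see "List of Packages") by a single tag. The `dataset_dir` path and the comma-separated caption format are assumptions, not guarantees of the dataset layout.

```python
import glob
import os

# Directory holding an extracted IMG+TXT package (see "List of Packages").
dataset_dir = 'dataset_dir'
wanted = 'rifle'  # any tag from the cluster table, e.g. 'rifle' or 'crop_top'

matches = []
for txt_path in glob.glob(os.path.join(dataset_dir, '**', '*.txt'), recursive=True):
    with open(txt_path, 'r', encoding='utf-8') as f:
        # Caption files are assumed to hold a single comma-separated tag string.
        tags = {t.strip() for t in f.read().split(',')}
    if wanted in tags:
        matches.append(os.path.splitext(txt_path)[0])

print(f'{len(matches)} images tagged with {wanted!r}')
```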
| CyberHarem/t_5000_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:34+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of t\_5000/T-5000/T-5000 (Girls' Frontline)
===================================================
This is the dataset of t\_5000/T-5000/T-5000 (Girls' Frontline), containing 17 images and their tags.
The core tags of this character are 'long\_hair, red\_hair, blue\_eyes, breasts, hair\_between\_eyes, very\_long\_hair, bangs, medium\_breasts, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7adacea4af18483722f8aa06e313c25c82d12d36 |
# Dataset of ff_m249saw/FFM249SAW/M249SAW (Girls' Frontline)
This is the dataset of ff_m249saw/FFM249SAW/M249SAW (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are `blue_hair, long_hair, yellow_eyes, breasts, very_long_hair, large_breasts, bangs, eyewear_on_head, medium_breasts, sunglasses`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 29.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 16.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 44 | 34.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 26.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 44 | 51.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ff_m249saw_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ff_m249saw_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, bikini, blush, cleavage, solo, bubble_blowing, chewing_gum, collarbone, jacket, navel, fur_trim |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, crop_top, midriff, navel, bubble_blowing, fingerless_gloves, short_shorts, solo, chewing_gum, cowboy_shot, hood |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | bikini | blush | cleavage | solo | bubble_blowing | chewing_gum | collarbone | jacket | navel | fur_trim | crop_top | midriff | fingerless_gloves | short_shorts | cowboy_shot | hood |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:---------|:--------|:-----------|:-------|:-----------------|:--------------|:-------------|:---------|:--------|:-----------|:-----------|:----------|:--------------------|:---------------|:--------------|:-------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 5 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | | | X | X | X | | | X | | X | X | X | X | X | X |
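A quick way to cross-check the cluster tags above is to count tag frequencies over an extracted IMG+TXT package. This is only a sketch; the `dataset_dir` path and the comma-separated caption format are assumptions.

```python
import glob
import os
from collections import Counter

# Directory holding an extracted IMG+TXT package (see "List of Packages").
dataset_dir = 'dataset_dir'
counter = Counter()

for txt_path in glob.glob(os.path.join(dataset_dir, '**', '*.txt'), recursive=True):
    with open(txt_path, 'r', encoding='utf-8') as f:
        # Assumed format: one comma-separated tag string per caption file.
        counter.update(t.strip() for t in f.read().split(',') if t.strip())

# Show the most frequent tags, which should roughly match the clusters above.
for tag, n in counter.most_common(15):
    print(f'{n:3d}  {tag}')
```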
| CyberHarem/ff_m249saw_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:27:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:31:48+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ff\_m249saw/FFM249SAW/M249SAW (Girls' Frontline)
===========================================================
This is the dataset of ff\_m249saw/FFM249SAW/M249SAW (Girls' Frontline), containing 19 images and their tags.
The core tags of this character are 'blue\_hair, long\_hair, yellow\_eyes, breasts, very\_long\_hair, large\_breasts, bangs, eyewear\_on\_head, medium\_breasts, sunglasses', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
27d49056dd5934d013a9cd75820d3b78b78a2296 |
# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-math-2x10.7b](https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b",
"harness_winogrande_5",
split="train")
```
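To see which per-task configurations exist before loading one, a small sketch like the following can help. `get_dataset_config_names` is a standard helper from the `datasets` library; the `harness_gsm8k_5` configuration name and the `latest` split follow the configuration listing of this repository.

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b"

# Enumerate the 63 per-task configurations mentioned above.
configs = get_dataset_config_names(repo)
print(len(configs), "configurations available")

# Load the GSM8K details; the "latest" split mirrors the most recent run.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_details)
```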
## Latest results
These are the [latest results from run 2024-01-14T02:37:03.730641](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b/blob/main/results_2024-01-14T02-37-03.730641.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.672153123323673,
"acc_stderr": 0.03128879331345752,
"acc_norm": 0.6725032879829345,
"acc_norm_stderr": 0.031933166428242975,
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.642058591491927,
"mc2_stderr": 0.015391497190020965
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726099,
"acc_norm": 0.6843003412969283,
"acc_norm_stderr": 0.013582571095815291
},
"harness|hellaswag|10": {
"acc": 0.6778530173272257,
"acc_stderr": 0.004663439181149046,
"acc_norm": 0.8630750846444931,
"acc_norm_stderr": 0.0034306550069275825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099583,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099583
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154957,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154957
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8064516129032258,
"acc_stderr": 0.022475258525536057,
"acc_norm": 0.8064516129032258,
"acc_norm_stderr": 0.022475258525536057
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.022828881775249377,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.022828881775249377
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033446,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033446
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.024152225962801584,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.024152225962801584
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038332,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038332
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517934,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517934
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322626,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322626
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993445,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993445
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.01646981492840617,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.01646981492840617
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5283687943262412,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.5283687943262412,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5104302477183833,
"acc_stderr": 0.012767457253930648,
"acc_norm": 0.5104302477183833,
"acc_norm_stderr": 0.012767457253930648
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7683823529411765,
"acc_stderr": 0.025626533803777562,
"acc_norm": 0.7683823529411765,
"acc_norm_stderr": 0.025626533803777562
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.018718067052623227,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.018718067052623227
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7795918367346939,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.7795918367346939,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466108,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466108
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.642058591491927,
"mc2_stderr": 0.015391497190020965
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781074
},
"harness|gsm8k|5": {
"acc": 0.7103866565579985,
"acc_stderr": 0.01249392734865963
}
}
```
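To condense the per-task block above into a few headline numbers, a minimal sketch such as the following can be used; it assumes the JSON shown above has been saved locally as `results.json` (a hypothetical file name).

```python
import json

# Load the per-task results shown above (saved locally as results.json).
with open("results.json", "r", encoding="utf-8") as f:
    results = json.load(f)

# Average accuracy over all MMLU (hendrycksTest) subtasks.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"MMLU subtasks: {len(mmlu)}, mean acc: {sum(mmlu) / len(mmlu):.4f}")

# A couple of the other headline metrics.
print(f"ARC acc_norm: {results['harness|arc:challenge|25']['acc_norm']:.4f}")
print(f"GSM8K acc: {results['harness|gsm8k|5']['acc']:.4f}")
```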
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b | [
"region:us"
] | 2024-01-14T02:39:24+00:00 | {"pretty_name": "Evaluation run of macadeliccc/SOLAR-math-2x10.7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [macadeliccc/SOLAR-math-2x10.7b](https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T02:37:03.730641](https://huggingface.co/datasets/open-llm-leaderboard/details_macadeliccc__SOLAR-math-2x10.7b/blob/main/results_2024-01-14T02-37-03.730641.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.672153123323673,\n \"acc_stderr\": 0.03128879331345752,\n \"acc_norm\": 0.6725032879829345,\n \"acc_norm_stderr\": 0.031933166428242975,\n \"mc1\": 0.4810281517747858,\n \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.642058591491927,\n \"mc2_stderr\": 0.015391497190020965\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726099,\n \"acc_norm\": 0.6843003412969283,\n \"acc_norm_stderr\": 0.013582571095815291\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6778530173272257,\n \"acc_stderr\": 0.004663439181149046,\n \"acc_norm\": 0.8630750846444931,\n \"acc_norm_stderr\": 0.0034306550069275825\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n \"acc_stderr\": 0.04266763404099583,\n \"acc_norm\": 0.5777777777777777,\n \"acc_norm_stderr\": 0.04266763404099583\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 
0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154957,\n \"acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154957\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8064516129032258,\n \"acc_stderr\": 0.022475258525536057,\n \"acc_norm\": 0.8064516129032258,\n \"acc_norm_stderr\": 0.022475258525536057\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8838383838383839,\n \"acc_stderr\": 0.022828881775249377,\n \"acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.022828881775249377\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033446,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033446\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n 
\"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.024152225962801584,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.024152225962801584\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n \"acc_stderr\": 0.029605103217038332,\n \"acc_norm\": 0.7354260089686099,\n \"acc_norm_stderr\": 0.029605103217038332\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n \"acc_stderr\": 0.02093019318517934,\n \"acc_norm\": 0.8846153846153846,\n \"acc_norm_stderr\": 0.02093019318517934\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322626,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322626\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n \"acc_stderr\": 0.013625556907993445,\n \"acc_norm\": 0.8237547892720306,\n \"acc_norm_stderr\": 
0.013625556907993445\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n \"acc_stderr\": 0.01646981492840617,\n \"acc_norm\": 0.4134078212290503,\n \"acc_norm_stderr\": 0.01646981492840617\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.02355083135199509,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.02355083135199509\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5283687943262412,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.5283687943262412,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5104302477183833,\n \"acc_stderr\": 0.012767457253930648,\n \"acc_norm\": 0.5104302477183833,\n \"acc_norm_stderr\": 0.012767457253930648\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7683823529411765,\n \"acc_stderr\": 0.025626533803777562,\n \"acc_norm\": 0.7683823529411765,\n \"acc_norm_stderr\": 0.025626533803777562\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.018718067052623227,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.018718067052623227\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7795918367346939,\n \"acc_stderr\": 0.02653704531214529,\n \"acc_norm\": 0.7795918367346939,\n \"acc_norm_stderr\": 0.02653704531214529\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466108,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466108\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4810281517747858,\n \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.642058591491927,\n \"mc2_stderr\": 0.015391497190020965\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781074\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7103866565579985,\n \"acc_stderr\": 0.01249392734865963\n }\n}\n```", "repo_url": "https://huggingface.co/macadeliccc/SOLAR-math-2x10.7b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|arc:challenge|25_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|gsm8k|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hellaswag|10_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-37-03.730641.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-37-03.730641.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-37-03.730641.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T02-37-03.730641.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-37-03.730641.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-37-03.730641.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["**/details_harness|winogrande|5_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T02-37-03.730641.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T02_37_03.730641", "path": ["results_2024-01-14T02-37-03.730641.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T02-37-03.730641.parquet"]}]}]} | 2024-01-14T02:39:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b
Dataset automatically created during the evaluation run of model macadeliccc/SOLAR-math-2x10.7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2024-01-14T02:37:03.730641 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SOLAR-math-2x10.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T02:37:03.730641(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of macadeliccc/SOLAR-math-2x10.7b\n\n\n\nDataset automatically created during the evaluation run of model macadeliccc/SOLAR-math-2x10.7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T02:37:03.730641(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
8c1b5ac7524acadf56210e567c964b0ad72eb6f4 |
# Dataset Card for Evaluation run of gagan3012/Multirial
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gagan3012/Multirial](https://huggingface.co/gagan3012/Multirial) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
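Each evaluated task therefore maps to a configuration name of its own (for example `harness_winogrande_5` below). If you need the full list, a minimal sketch with the generic `datasets` helper functions looks like this (nothing here is specific to this repository beyond its id):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_gagan3012__Multirial"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Splits of a single configuration: one timestamped split per run, plus "latest".
print(get_dataset_split_names(repo, "harness_winogrande_5"))
```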
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gagan3012__Multirial",
"harness_winogrande_5",
split="train")
```
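The aggregated scores can be loaded the same way through the "results" configuration; the sketch below uses the "latest" split, which always mirrors the most recent run:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_gagan3012__Multirial",
	"results",
	split="latest")
```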
## Latest results
These are the [latest results from run 2024-01-14T02:38:13.132787](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multirial/blob/main/results_2024-01-14T02-38-13.132787.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6087068516861436,
"acc_stderr": 0.032980911385021405,
"acc_norm": 0.6135781515215905,
"acc_norm_stderr": 0.03364558465127436,
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5469648449991642,
"mc2_stderr": 0.01540322430997804
},
"harness|arc:challenge|25": {
"acc": 0.5947098976109215,
"acc_stderr": 0.01434686906022933,
"acc_norm": 0.6322525597269625,
"acc_norm_stderr": 0.014090995618168478
},
"harness|hellaswag|10": {
"acc": 0.6061541525592511,
"acc_stderr": 0.0048760280379419405,
"acc_norm": 0.7956582354112727,
"acc_norm_stderr": 0.0040239573344619875
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.02491524398598785,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.02491524398598785
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652458,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652458
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.01697028909045803,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.01697028909045803
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.0286265479124374,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.0286265479124374
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6932515337423313,
"acc_stderr": 0.03623089915724147,
"acc_norm": 0.6932515337423313,
"acc_norm_stderr": 0.03623089915724147
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597542,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597542
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.01492744710193716,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.01492744710193716
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069706,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069706
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44589308996088656,
"acc_stderr": 0.012695244711379772,
"acc_norm": 0.44589308996088656,
"acc_norm_stderr": 0.012695244711379772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.01955964680921593,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.01955964680921593
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37576499388004897,
"mc1_stderr": 0.016954584060214297,
"mc2": 0.5469648449991642,
"mc2_stderr": 0.01540322430997804
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855576
},
"harness|gsm8k|5": {
"acc": 0.4040940106141016,
"acc_stderr": 0.013516752972721716
}
}
```
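As a rough sanity check, the per-task numbers above can be re-aggregated by hand. The snippet below is only a sketch: it assumes the block above has been parsed into a Python dict named `results` (e.g. with `json.loads`), and it takes an unweighted mean over the MMLU (hendrycksTest) subtask accuracies, which may differ slightly from the leaderboard's official aggregation:

```python
# `results` is assumed to hold the parsed dict shown above.
mmlu_keys = [k for k in results if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(results[k]["acc"] for k in mmlu_keys) / len(mmlu_keys)
print(f"{len(mmlu_keys)} MMLU subtasks, unweighted mean acc = {mmlu_acc:.4f}")
```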
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_gagan3012__Multirial | [
"region:us"
] | 2024-01-14T02:40:30+00:00 | {"pretty_name": "Evaluation run of gagan3012/Multirial", "dataset_summary": "Dataset automatically created during the evaluation run of model [gagan3012/Multirial](https://huggingface.co/gagan3012/Multirial) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gagan3012__Multirial\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T02:38:13.132787](https://huggingface.co/datasets/open-llm-leaderboard/details_gagan3012__Multirial/blob/main/results_2024-01-14T02-38-13.132787.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6087068516861436,\n \"acc_stderr\": 0.032980911385021405,\n \"acc_norm\": 0.6135781515215905,\n \"acc_norm_stderr\": 0.03364558465127436,\n \"mc1\": 0.37576499388004897,\n \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5469648449991642,\n \"mc2_stderr\": 0.01540322430997804\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5947098976109215,\n \"acc_stderr\": 0.01434686906022933,\n \"acc_norm\": 0.6322525597269625,\n \"acc_norm_stderr\": 0.014090995618168478\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6061541525592511,\n \"acc_stderr\": 0.0048760280379419405,\n \"acc_norm\": 0.7956582354112727,\n \"acc_norm_stderr\": 0.0040239573344619875\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099834,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099834\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n \"acc_stderr\": 0.02698528957655274,\n \"acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.02698528957655274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5923076923076923,\n \"acc_stderr\": 
0.02491524398598785,\n \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.02491524398598785\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652458,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652458\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8055045871559633,\n \"acc_stderr\": 0.01697028909045803,\n \"acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.01697028909045803\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7892156862745098,\n \"acc_stderr\": 0.0286265479124374,\n \"acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.0286265479124374\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6932515337423313,\n \"acc_stderr\": 0.03623089915724147,\n \"acc_norm\": 0.6932515337423313,\n \"acc_norm_stderr\": 0.03623089915724147\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597542,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597542\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n \"acc_stderr\": 0.01492744710193716,\n \"acc_norm\": 0.7752234993614304,\n \"acc_norm_stderr\": 0.01492744710193716\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069706,\n \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069706\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44589308996088656,\n \"acc_stderr\": 0.012695244711379772,\n \"acc_norm\": 0.44589308996088656,\n \"acc_norm_stderr\": 0.012695244711379772\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6274509803921569,\n \"acc_stderr\": 0.01955964680921593,\n \"acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.01955964680921593\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.7114427860696517,\n \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37576499388004897,\n \"mc1_stderr\": 0.016954584060214297,\n \"mc2\": 0.5469648449991642,\n \"mc2_stderr\": 0.01540322430997804\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855576\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4040940106141016,\n \"acc_stderr\": 0.013516752972721716\n }\n}\n```", "repo_url": "https://huggingface.co/gagan3012/Multirial", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|arc:challenge|25_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|gsm8k|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hellaswag|10_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T02-38-13.132787.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["**/details_harness|winogrande|5_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T02-38-13.132787.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T02_38_13.132787", "path": ["results_2024-01-14T02-38-13.132787.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T02-38-13.132787.parquet"]}]}]} | 2024-01-14T02:40:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of gagan3012/Multirial
Dataset automatically created during the evaluation run of model gagan3012/Multirial on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
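Because each evaluated task maps to its own configuration, the available configuration names can be enumerated programmatically. A minimal sketch, assuming the details repository follows the leaderboard's usual `details_<org>__<model>` naming (the exact repo id is not spelled out in this card):
```python
from datasets import get_dataset_config_names
# Enumerate the per-task configurations plus the aggregated "results" config.
# The repo id below is an assumption based on the leaderboard's standard naming.
configs = get_dataset_config_names("open-llm-leaderboard/details_gagan3012__Multirial")
print(len(configs))
print(configs[:5])
```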
To load the details from a run, you can for instance do the following:
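A minimal sketch, under the same repo-id assumption as above; `harness_winogrande_5` is one of the configurations declared in this card's metadata, and the `latest` split points at the most recent files of each configuration:
```python
from datasets import load_dataset
# Per-task details: any configuration listed in the metadata above works here.
details = load_dataset(
    "open-llm-leaderboard/details_gagan3012__Multirial",  # assumed repo id
    "harness_winogrande_5",
    split="latest",
)
print(details)
# The aggregated metrics of the run live in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_gagan3012__Multirial",  # assumed repo id
    "results",
    split="latest",
)
print(results)
```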
## Latest results
These are the latest results from run 2024-01-14T02:38:13.132787 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of gagan3012/Multirial\n\n\n\nDataset automatically created during the evaluation run of model gagan3012/Multirial on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T02:38:13.132787(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of gagan3012/Multirial\n\n\n\nDataset automatically created during the evaluation run of model gagan3012/Multirial on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T02:38:13.132787(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b9993eca96a202eea3e9d8f77eca151d4c252fed |
# Dataset of a_91/A-91/A-91 (Girls' Frontline)
This is the dataset of a_91/A-91/A-91 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `blonde_hair, long_hair, breasts, yellow_eyes, hair_between_eyes, mole, mole_under_eye, bangs, large_breasts, hat, medium_breasts, multicolored_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 29.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 15.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 46 | 29.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 24.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 46 | 42.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_91_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
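The 800/1200 and stage3 packages above are plain IMG+TXT archives that can be used directly. A minimal sketch, assuming the usual layout where each image is paired with a same-named `.txt` file containing its comma-separated tags:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
# download the 800px IMG+TXT package instead of the raw one
zip_file = hf_hub_download(
    repo_id='CyberHarem/a_91_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# pair every image with its same-named .txt tag file (assumed layout)
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    tag_file = os.path.join(dataset_dir, stem + '.txt')
    tags = ''
    if os.path.exists(tag_file):
        with open(tag_file, 'r', encoding='utf-8') as f:
            tags = f.read().strip()
    print(name, '->', tags)
```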
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/a_91_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, smile, looking_at_viewer, solo, gloves, open_mouth, black_bodysuit, holding, cleavage, drunk |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | smile | looking_at_viewer | solo | gloves | open_mouth | black_bodysuit | holding | cleavage | drunk |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:-------|:---------|:-------------|:-----------------|:----------|:-----------|:--------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/a_91_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:55:02+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:59:23+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of a\_91/A-91/A-91 (Girls' Frontline)
=============================================
This is the dataset of a\_91/A-91/A-91 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are 'blonde\_hair, long\_hair, breasts, yellow\_eyes, hair\_between\_eyes, mole, mole\_under\_eye, bangs, large\_breasts, hat, medium\_breasts, multicolored\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
1ab7ff4f8dca159b31f1fd54c8437302be50679b |
# Dataset of fx_05/FX-05/FX-05 (Girls' Frontline)
This is the dataset of fx_05/FX-05/FX-05 (Girls' Frontline), containing 14 images and their tags.
The core tags of this character are `blue_eyes, long_hair, breasts, large_breasts, grey_hair, hat, bangs, very_long_hair, black_headwear, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 14 | 18.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 14 | 11.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 33 | 22.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 14 | 17.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 33 | 31.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fx_05_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fx_05_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, pantyhose, jewelry, holding, smile, assault_rifle, jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | pantyhose | jewelry | holding | smile | assault_rifle | jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:------------|:----------|:----------|:--------|:----------------|:---------|
| 0 | 14 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X |
| CyberHarem/fx_05_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:55:07+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:00:10+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fx\_05/FX-05/FX-05 (Girls' Frontline)
================================================
This is the dataset of fx\_05/FX-05/FX-05 (Girls' Frontline), containing 14 images and their tags.
The core tags of this character are 'blue\_eyes, long\_hair, breasts, large\_breasts, grey\_hair, hat, bangs, very\_long\_hair, black\_headwear, earrings', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
6f13738d010c8a3705dc75b22c5e0c419f5125fc |
# Dataset of ads/ADS/ADS (Girls' Frontline)
This is the dataset of ads/ADS/ADS (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `blue_eyes, blue_hair, long_hair, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 20.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 10.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 20.69 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 17.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 30.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ads_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ads_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, blush, white_gloves, barefoot, blue_dress, puffy_short_sleeves, full_body, see-through, white_background, assault_rifle, closed_mouth, holding, simple_background, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | white_gloves | barefoot | blue_dress | puffy_short_sleeves | full_body | see-through | white_background | assault_rifle | closed_mouth | holding | simple_background | white_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:---------------|:-----------|:-------------|:----------------------|:------------|:--------------|:-------------------|:----------------|:---------------|:----------|:--------------------|:--------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ads_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:55:13+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T02:58:56+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ads/ADS/ADS (Girls' Frontline)
=========================================
This is the dataset of ads/ADS/ADS (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are 'blue\_eyes, blue\_hair, long\_hair, hairband', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b7bd5749fa80cb8d90c3769093f58792a4199cd3 |
# Dataset of hatsushimo/初霜/初霜 (Azur Lane)
This is the dataset of hatsushimo/初霜/初霜 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `animal_ears, hair_ornament, pink_hair, animal_ear_fluff, cat_ears, hairclip, red_eyes, ahoge, bangs, hair_between_eyes, cat_tail, fang, long_hair, tail, breasts, twintails, cat_girl, small_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 16.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 10.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 29 | 21.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 14.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 29 | 29.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsushimo_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hatsushimo_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
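As a small extension of the snippet above, here is a hedged sketch of filtering the loaded items by one of the cluster tags listed in the next section. It assumes `item.meta['tags']` supports membership tests on tag names (the snippet above only prints it, so its exact structure is an assumption here).
```python
from waifuc.source import LocalSource

# Assumes the raw archive was already extracted to 'dataset_dir' as shown above.
source = LocalSource('dataset_dir')

# 'kimono' is one of the cluster tags listed below; any other tag works the same way.
kimono_items = [item for item in source if 'kimono' in item.meta['tags']]
print(f'{len(kimono_items)} images tagged with kimono')
```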
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, looking_at_viewer, solo, open_mouth, :d, bare_shoulders, kimono, long_sleeves, thighhighs, wide_sleeves, choker, jingle_bell, cleavage, collarbone, garter_straps, obi, pleated_skirt, simple_background, underwear, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | looking_at_viewer | solo | open_mouth | :d | bare_shoulders | kimono | long_sleeves | thighhighs | wide_sleeves | choker | jingle_bell | cleavage | collarbone | garter_straps | obi | pleated_skirt | simple_background | underwear | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:-------------|:-----|:-----------------|:---------|:---------------|:-------------|:---------------|:---------|:--------------|:-----------|:-------------|:----------------|:------|:----------------|:--------------------|:------------|:-------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hatsushimo_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:57:47+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:30:09+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of hatsushimo/初霜/初霜 (Azur Lane)
=======================================
This is the dataset of hatsushimo/初霜/初霜 (Azur Lane), containing 11 images and their tags.
The core tags of this character are 'animal\_ears, hair\_ornament, pink\_hair, animal\_ear\_fluff, cat\_ears, hairclip, red\_eyes, ahoge, bangs, hair\_between\_eyes, cat\_tail, fang, long\_hair, tail, breasts, twintails, cat\_girl, small\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
40c3d282435bcea229d7bf22183c81146630cd58 |
# Dataset of georg_thiele/ゲオルク・ティーレ/Z2 (Azur Lane)
This is the dataset of georg_thiele/ゲオルク・ティーレ/Z2 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `bangs, red_eyes, long_hair, braid, black_hair, brown_hair, hat, beret, bow, hair_bun, red_bow, single_hair_bun, single_side_bun`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 12.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 7.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 18 | 13.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 10.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 18 | 18.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/georg_thiele_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/georg_thiele_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, full_body, long_sleeves, obi, closed_mouth, simple_background, sitting, standing, white_background, wide_sleeves, barefoot, black_footwear, boots, candy_apple, holding_food, jacket, striped_kimono, yukata |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | full_body | long_sleeves | obi | closed_mouth | simple_background | sitting | standing | white_background | wide_sleeves | barefoot | black_footwear | boots | candy_apple | holding_food | jacket | striped_kimono | yukata |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:------------|:---------------|:------|:---------------|:--------------------|:----------|:-----------|:-------------------|:---------------|:-----------|:-----------------|:--------|:--------------|:---------------|:---------|:-----------------|:---------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/georg_thiele_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:57:53+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:09:17+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of georg\_thiele/ゲオルク・ティーレ/Z2 (Azur Lane)
=================================================
This is the dataset of georg\_thiele/ゲオルク・ティーレ/Z2 (Azur Lane), containing 13 images and their tags.
The core tags of this character are 'bangs, red\_eyes, long\_hair, braid, black\_hair, brown\_hair, hat, beret, bow, hair\_bun, red\_bow, single\_hair\_bun, single\_side\_bun', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ae9fa160b4d65a8b6b0c50b6d9a50ae490e0bacd |
# Dataset of ise/伊勢/伊势 (Azur Lane)
This is the dataset of ise/伊勢/伊势 (Azur Lane), containing 11 images and their tags.
The core tags of this character are `animal_ears, breasts, fox_ears, red_hair, fox_tail, tail, hair_ornament, ponytail, bangs, large_breasts, long_hair, medium_breasts, red_eyes, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 14.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 8.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 23 | 15.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 12.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 23 | 24.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ise_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ise_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, midriff, navel, smile, cleavage, fingerless_gloves, hakama_skirt, simple_background, collarbone, black_gloves, full_body, standing, sword, white_background, detached_sleeves, hip_vent, holding_weapon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | midriff | navel | smile | cleavage | fingerless_gloves | hakama_skirt | simple_background | collarbone | black_gloves | full_body | standing | sword | white_background | detached_sleeves | hip_vent | holding_weapon |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:----------|:--------|:--------|:-----------|:--------------------|:---------------|:--------------------|:-------------|:---------------|:------------|:-----------|:--------|:-------------------|:-------------------|:-----------|:-----------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/ise_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T02:57:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:01:19+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ise/伊勢/伊势 (Azur Lane)
================================
This is the dataset of ise/伊勢/伊势 (Azur Lane), containing 11 images and their tags.
The core tags of this character are 'animal\_ears, breasts, fox\_ears, red\_hair, fox\_tail, tail, hair\_ornament, ponytail, bangs, large\_breasts, long\_hair, medium\_breasts, red\_eyes, hair\_between\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
7c7fd84cba2d68d4235219842a9e1f4c9c324100 |
# Dataset of laffey_ii/ラフィーII/拉菲II (Azur Lane)
This is the dataset of laffey_ii/ラフィーII/拉菲II (Azur Lane), containing 34 images and their tags.
The core tags of this character are `long_hair, twintails, white_hair, rabbit_ears, red_eyes, animal_ears, bangs, hairband, fake_animal_ears, very_long_hair, breasts, hair_between_eyes, ribbon, small_breasts, hair_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 34 | 59.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 34 | 26.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 95 | 66.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 34 | 49.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 95 | 102.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/laffey_ii_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/laffey_ii_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, bare_shoulders, looking_at_viewer, blush, white_thighhighs, long_sleeves, white_dress, off_shoulder, collarbone, parted_lips, simple_background, sleeves_past_fingers |
| 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, looking_at_viewer, playboy_bunny, solo, white_pantyhose, cup, full_body, official_alternate_costume, strapless_leotard, blue_leotard, blush, holding_tray, no_shoes |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | bare_shoulders | looking_at_viewer | blush | white_thighhighs | long_sleeves | white_dress | off_shoulder | collarbone | parted_lips | simple_background | sleeves_past_fingers | playboy_bunny | white_pantyhose | cup | full_body | official_alternate_costume | strapless_leotard | blue_leotard | holding_tray | no_shoes |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:--------------------|:--------|:-------------------|:---------------|:--------------|:---------------|:-------------|:--------------|:--------------------|:-----------------------|:----------------|:------------------|:------|:------------|:-----------------------------|:--------------------|:---------------|:---------------|:-----------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 12 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X |
| CyberHarem/laffey_ii_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:03:09+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:13:55+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of laffey\_ii/ラフィーII/拉菲II (Azur Lane)
=============================================
This is the dataset of laffey\_ii/ラフィーII/拉菲II (Azur Lane), containing 34 images and their tags.
The core tags of this character are 'long\_hair, twintails, white\_hair, rabbit\_ears, red\_eyes, animal\_ears, bangs, hairband, fake\_animal\_ears, very\_long\_hair, breasts, hair\_between\_eyes, ribbon, small\_breasts, hair\_ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a75789e4090380407b6ada0f8d443eb5e7a1300a |
# Data Format(s)
The lichess dataset with 16M chess games was used. These games were transcoded into UCI notation, with the minor modification that the BOS token (`;`) is added to every game and the EOS token `#` is added whenever there's a checkmate.
# Character-based encoding vocab
Tokenization is simplified by using a vocabulary with 23 characters in the following order:
```
[' ', '1', '2', '3', '4', '5', '6', '7', '8', ';', '#', 'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'n', 'r', 'q', 'k']
```
With the exception of `'b'`, all other tokens have a unique purpose in the dataset.
This vocab also makes the encoding/decoding of board squares intuitive for human readers. Encoded games always begin with a 9, the BOS token. Columns (a through h) are encoded to 11 through 18, and rows keep their integer values 1 through 8. So the move `e2e4` becomes `[15, 2, 15, 4]`, and the move `g1f3` becomes `[17, 1, 16, 3]`, which is convenient because the '7' in 17 and the '6' in 16 correspond to the 7th and 6th columns respectively. Likewise, breaks between moves are encoded as `0` and checkmate is encoded as `10`, so the sequence `b1b8 a8b8#` becomes `[12,1,12,8,0,11,8,12,8,10]` (a minimal encoding sketch follows this record). | austindavis/chess_mi | [
"task_categories:text-generation",
"size_categories:10M<n<100M",
"region:us"
] | 2024-01-14T03:10:44+00:00 | {"size_categories": ["10M<n<100M"], "task_categories": ["text-generation"], "pretty_name": "Chess Mech Interp"} | 2024-01-15T08:52:07+00:00 | [] | [] | TAGS
#task_categories-text-generation #size_categories-10M<n<100M #region-us
| # Data Format(s)
The lichess dataset with 16M chess games was used. These games were transcoded into UCI notation, with the minor modification that the BOS token (';') is added to every game and the EOS token '#' is added whenever there's a checkmate.
# Character-based encoding vocab
Tokenization is simplified by using a vocabulary with 23 characters in the following order:
With the exception of ''b'', all other tokens have a unique purpose in the dataset.
This vocab also makes intuitive the encoding/decoding of board squares for human readers. Games always begin with a 9. Columns (a through h) are encoded to 11 through 18. Rows are encoded to their integer values 1 through 8. So a move 'e2e4' becomes '[15, 2, 15, 4]', and a move 'g1f3' becomes '[17, 1, 16, 3]'. Which is convenient because the '7' in the 17 and the '6' from the 16 correspond to the 7th and 6th columns respectively. Likewise, breaks between moves are encoded as '0' and checkmate is encoded as '10'. So the sequence 'b1b8 a8b8#' becomes '[12,1,12,8,0,11,8,12,8,10]' | [
"# Data Format(s)\nThe lichess dataset with 16M chess games was used. These games were transcoded into UCI notation, with the minor modification that the BOS token (';') is added to every game and the EOS token '#' is added whenever there's a checkmate.",
"# Character-based encoding vocab\nTokenization is simplified by using a vocabulary with 23 characters in the following order:\n\nWith the exception of ''b'', all other tokens have a unique purpose in the dataset.\nThis vocab also makes intuitive the encoding/decoding of board squares for human readers. Games always begin with a 9. Columns (a through h) are encoded to 11 through 18. Rows are encoded to their integer values 1 through 8. So a move 'e2e4' becomes '[15, 2, 15, 4]', and a move 'g1f3' becomes '[17, 1, 16, 3]'. Which is convenient because the '7' in the 17 and the '6' from the 16 correspond to the 7th and 6th columns respectively. Likewise, breaks between moves are encoded as '0' and checkmate is encoded as '10'. So the sequence 'b1b8 a8b8#' becomes '[12,1,12,8,0,11,8,12,8,10]'"
] | [
"TAGS\n#task_categories-text-generation #size_categories-10M<n<100M #region-us \n",
"# Data Format(s)\nThe lichess dataset with 16M chess games was used. These games were transcoded into UCI notation, with the minor modification that the BOS token (';') is added to every game and the EOS token '#' is added whenever there's a checkmate.",
"# Character-based encoding vocab\nTokenization is simplified by using a vocabulary with 23 characters in the following order:\n\nWith the exception of ''b'', all other tokens have a unique purpose in the dataset.\nThis vocab also makes intuitive the encoding/decoding of board squares for human readers. Games always begin with a 9. Columns (a through h) are encoded to 11 through 18. Rows are encoded to their integer values 1 through 8. So a move 'e2e4' becomes '[15, 2, 15, 4]', and a move 'g1f3' becomes '[17, 1, 16, 3]'. Which is convenient because the '7' in the 17 and the '6' from the 16 correspond to the 7th and 6th columns respectively. Likewise, breaks between moves are encoded as '0' and checkmate is encoded as '10'. So the sequence 'b1b8 a8b8#' becomes '[12,1,12,8,0,11,8,12,8,10]'"
] |
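The following is a minimal sketch of the character-level encoding described in the chess card above; it is not part of the original card, the vocabulary order is taken verbatim from the card, and the helper names (`encode`, `decode`, `STOI`, `ITOS`) are illustrative.
```python
# Minimal sketch of the UCI-move encoding described above.
VOCAB = [' ', '1', '2', '3', '4', '5', '6', '7', '8', ';', '#',
         'a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'n', 'r', 'q', 'k']
STOI = {ch: i for i, ch in enumerate(VOCAB)}  # character -> token id
ITOS = {i: ch for i, ch in enumerate(VOCAB)}  # token id -> character

def encode(game: str) -> list[int]:
    """Encode a UCI move string (BOS ';' prefix, '#' on checkmate) into token ids."""
    return [STOI[ch] for ch in game]

def decode(ids: list[int]) -> str:
    """Decode token ids back into the move string."""
    return ''.join(ITOS[i] for i in ids)

# The worked examples from the card:
assert encode('e2e4') == [15, 2, 15, 4]
assert encode('g1f3') == [17, 1, 16, 3]
assert encode('b1b8 a8b8#') == [12, 1, 12, 8, 0, 11, 8, 12, 8, 10]
assert decode([9]) == ';'  # encoded games always begin with 9, the BOS token
```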
fd55eb4a59f94c4e20d04a7efa858cafc3a3c2c3 |
# Dataset of m38/M38/伯莱塔38型 (Girls' Frontline)
This is the dataset of m38/M38/伯莱塔38型 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `blue_eyes, long_hair, ahoge, hat, bangs, beret, hair_ornament, brown_hair, hairclip, breasts, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 13.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 13.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 10.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 20.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/m38_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/m38_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, simple_background, solo, white_shirt, long_sleeves, pleated_skirt, submachine_gun, white_background, white_thighhighs, black_footwear, black_skirt, closed_mouth, holding_gun, jacket, military_uniform, red_necktie, loafers, belt, blush, collared_shirt, full_body, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | simple_background | solo | white_shirt | long_sleeves | pleated_skirt | submachine_gun | white_background | white_thighhighs | black_footwear | black_skirt | closed_mouth | holding_gun | jacket | military_uniform | red_necktie | loafers | belt | blush | collared_shirt | full_body | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:-------|:--------------|:---------------|:----------------|:-----------------|:-------------------|:-------------------|:-----------------|:--------------|:---------------|:--------------|:---------|:-------------------|:--------------|:----------|:-------|:--------|:-----------------|:------------|:-----------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/m38_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:22:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:26:41+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of m38/M38/伯莱塔38型 (Girls' Frontline)
============================================
This is the dataset of m38/M38/伯莱塔38型 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are 'blue\_eyes, long\_hair, ahoge, hat, bangs, beret, hair\_ornament, brown\_hair, hairclip, breasts, hair\_between\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
dc00c3ed78f3d258fa32c6d169a032c5670188f2 |
# Dataset of js05/JS05/JS05 (Girls' Frontline)
This is the dataset of js05/JS05/JS05 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are `short_hair, green_eyes, grey_hair, bangs, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 14.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 9.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 18.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 14.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 25.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/js05_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/js05_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, black_gloves, looking_at_viewer, simple_background, fingerless_gloves, closed_mouth, jewelry, smile, white_background, bare_shoulders, choker, elbow_gloves, holding, skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_gloves | looking_at_viewer | simple_background | fingerless_gloves | closed_mouth | jewelry | smile | white_background | bare_shoulders | choker | elbow_gloves | holding | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:--------------------|:--------------------|:---------------|:----------|:--------|:-------------------|:-----------------|:---------|:---------------|:----------|:--------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/js05_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:22:22+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:25:43+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of js05/JS05/JS05 (Girls' Frontline)
============================================
This is the dataset of js05/JS05/JS05 (Girls' Frontline), containing 13 images and their tags.
The core tags of this character are 'short\_hair, green\_eyes, grey\_hair, bangs, earrings', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
5baa6e4e0b323b5e4d81f30614eb538e96a02375 |
# Dataset of f1/F1/F1 (Girls' Frontline)
This is the dataset of f1/F1/F1 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are `hat, blue_eyes, brown_hair, long_hair, twintails`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 10 | 10.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 10 | 6.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 22 | 13.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 10 | 10.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 22 | 18.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/f1_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/f1_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, open_mouth, holding, looking_at_viewer, boots, fingerless_gloves, scarf, :d, rifle, shirt, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | open_mouth | holding | looking_at_viewer | boots | fingerless_gloves | scarf | :d | rifle | shirt | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------------|:----------|:--------------------|:--------|:--------------------|:--------|:-----|:--------|:--------|:--------------------|
| 0 | 10 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/f1_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:22:27+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:25:29+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of f1/F1/F1 (Girls' Frontline)
======================================
This is the dataset of f1/F1/F1 (Girls' Frontline), containing 10 images and their tags.
The core tags of this character are 'hat, blue\_eyes, brown\_hair, long\_hair, twintails', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
909e14747ebbc746972e70564f08b782d88874bd |
# Dataset Card for Evaluation run of jefferylovely/AthenaImaniMaven
Dataset automatically created during the evaluation run of model [jefferylovely/AthenaImaniMaven](https://huggingface.co/jefferylovely/AthenaImaniMaven) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven",
"harness_winogrande_5",
split="train")
```
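As a further hedged sketch, assuming the configuration and split names described in this card (the "results" configuration and the "latest" split), the aggregated metrics can be loaded in the same way:
```python
from datasets import load_dataset

# "results" is the aggregated-results configuration mentioned above;
# per the note below, each eval also exposes a "latest" split.
results = load_dataset(
    "open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven",
    "results",
    split="latest",
)
print(results[0])
```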
## Latest results
These are the [latest results from run 2024-01-14T03:41:28.738425](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven/blob/main/results_2024-01-14T03-41-28.738425.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5906347296238645,
"acc_stderr": 0.03376761541575429,
"acc_norm": 0.5954572418779962,
"acc_norm_stderr": 0.03446646542818655,
"mc1": 0.42105263157894735,
"mc1_stderr": 0.01728393624813649,
"mc2": 0.5857820006375237,
"mc2_stderr": 0.015441927798310004
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449698,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759084
},
"harness|hellaswag|10": {
"acc": 0.6552479585739892,
"acc_stderr": 0.004743160034271149,
"acc_norm": 0.8465445130452102,
"acc_norm_stderr": 0.0035968938961909148
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6452830188679245,
"acc_stderr": 0.029445175328199586,
"acc_norm": 0.6452830188679245,
"acc_norm_stderr": 0.029445175328199586
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6597222222222222,
"acc_stderr": 0.039621355734862175,
"acc_norm": 0.6597222222222222,
"acc_norm_stderr": 0.039621355734862175
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.04966570903978529,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.04966570903978529
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6774193548387096,
"acc_stderr": 0.02659308451657228,
"acc_norm": 0.6774193548387096,
"acc_norm_stderr": 0.02659308451657228
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.028408953626245282,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.028408953626245282
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130952,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130952
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652456,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652456
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7853211009174312,
"acc_stderr": 0.01760430414925648,
"acc_norm": 0.7853211009174312,
"acc_norm_stderr": 0.01760430414925648
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422872,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422872
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.04691521224077742,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.04691521224077742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7611749680715197,
"acc_stderr": 0.015246803197398677,
"acc_norm": 0.7611749680715197,
"acc_norm_stderr": 0.015246803197398677
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977243,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977243
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.293854748603352,
"acc_stderr": 0.015235075776719613,
"acc_norm": 0.293854748603352,
"acc_norm_stderr": 0.015235075776719613
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138016,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144363,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144363
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.012573836633799015,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.012573836633799015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105932,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105932
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731572,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731572
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6653061224489796,
"acc_stderr": 0.030209235226242307,
"acc_norm": 0.6653061224489796,
"acc_norm_stderr": 0.030209235226242307
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.027403859410786848,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.027403859410786848
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.78,
"acc_stderr": 0.041633319989322605,
"acc_norm": 0.78,
"acc_norm_stderr": 0.041633319989322605
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764003,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764003
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42105263157894735,
"mc1_stderr": 0.01728393624813649,
"mc2": 0.5857820006375237,
"mc2_stderr": 0.015441927798310004
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663595
},
"harness|gsm8k|5": {
"acc": 0.3502653525398029,
"acc_stderr": 0.01314040945557127
}
}
```
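
As a minimal sketch of how one might work with these numbers programmatically, the snippet below downloads the results file linked above and prints the aggregate "all" metrics. The file name is taken from the link; note that the downloaded JSON may wrap these metrics under an extra top-level "results" key, which the sketch accounts for.

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file linked above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven",
    repo_type="dataset",
    filename="results_2024-01-14T03-41-28.738425.json",
)

with open(path) as f:
    data = json.load(f)

# Some dumps nest the per-task metrics under a "results" key; fall back to the
# top level if the file matches the layout shown above.
metrics = data.get("results", data)

for name, value in metrics["all"].items():
    print(f"{name}: {value:.4f}")
```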
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven | [
"region:us"
] | 2024-01-14T03:23:58+00:00 | {"pretty_name": "Evaluation run of jefferylovely/AthenaImaniMaven", "dataset_summary": "Dataset automatically created during the evaluation run of model [jefferylovely/AthenaImaniMaven](https://huggingface.co/jefferylovely/AthenaImaniMaven) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T03:41:28.738425](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven/blob/main/results_2024-01-14T03-41-28.738425.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5906347296238645,\n \"acc_stderr\": 0.03376761541575429,\n \"acc_norm\": 0.5954572418779962,\n \"acc_norm_stderr\": 0.03446646542818655,\n \"mc1\": 0.42105263157894735,\n \"mc1_stderr\": 0.01728393624813649,\n \"mc2\": 0.5857820006375237,\n \"mc2_stderr\": 0.015441927798310004\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449698,\n \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759084\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6552479585739892,\n \"acc_stderr\": 0.004743160034271149,\n \"acc_norm\": 0.8465445130452102,\n \"acc_norm_stderr\": 0.0035968938961909148\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6452830188679245,\n \"acc_stderr\": 0.029445175328199586,\n \"acc_norm\": 0.6452830188679245,\n \"acc_norm_stderr\": 0.029445175328199586\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6597222222222222,\n \"acc_stderr\": 0.039621355734862175,\n \"acc_norm\": 0.6597222222222222,\n \"acc_norm_stderr\": 0.039621355734862175\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 
0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.04966570903978529,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.04966570903978529\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6774193548387096,\n \"acc_stderr\": 0.02659308451657228,\n \"acc_norm\": 0.6774193548387096,\n \"acc_norm_stderr\": 0.02659308451657228\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.028408953626245282,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.028408953626245282\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130952,\n \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130952\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652456,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652456\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7853211009174312,\n \"acc_stderr\": 0.01760430414925648,\n \"acc_norm\": 0.7853211009174312,\n \"acc_norm_stderr\": 0.01760430414925648\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115071,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115071\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422872,\n \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422872\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.04691521224077742,\n \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.04691521224077742\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7611749680715197,\n \"acc_stderr\": 0.015246803197398677,\n \"acc_norm\": 0.7611749680715197,\n \"acc_norm_stderr\": 0.015246803197398677\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977243,\n \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977243\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.293854748603352,\n \"acc_stderr\": 0.015235075776719613,\n \"acc_norm\": 0.293854748603352,\n \"acc_norm_stderr\": 0.015235075776719613\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144363,\n \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144363\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n \"acc_stderr\": 0.012573836633799015,\n \"acc_norm\": 0.41264667535853977,\n \"acc_norm_stderr\": 0.012573836633799015\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105932,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105932\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n \"acc_stderr\": 0.04724577405731572,\n \"acc_norm\": 0.5818181818181818,\n \"acc_norm_stderr\": 0.04724577405731572\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6653061224489796,\n \"acc_stderr\": 0.030209235226242307,\n \"acc_norm\": 0.6653061224489796,\n \"acc_norm_stderr\": 0.030209235226242307\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n \"acc_stderr\": 0.027403859410786848,\n \"acc_norm\": 0.8159203980099502,\n \"acc_norm_stderr\": 0.027403859410786848\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.041633319989322605,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.041633319989322605\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764003,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764003\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42105263157894735,\n \"mc1_stderr\": 0.01728393624813649,\n \"mc2\": 0.5857820006375237,\n \"mc2_stderr\": 0.015441927798310004\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663595\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3502653525398029,\n \"acc_stderr\": 0.01314040945557127\n 
}\n}\n```", "repo_url": "https://huggingface.co/jefferylovely/AthenaImaniMaven", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-21-32.157923.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-21-32.157923.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-41-28.738425.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-41-28.738425.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-41-28.738425.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-41-28.738425.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-21-32.157923.parquet"]}, 
{"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["**/details_harness|winogrande|5_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": ["**/details_harness|winogrande|5_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T03-41-28.738425.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T03_21_32.157923", "path": ["results_2024-01-14T03-21-32.157923.parquet"]}, {"split": "2024_01_14T03_41_28.738425", "path": 
["results_2024-01-14T03-41-28.738425.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T03-41-28.738425.parquet"]}]}]} | 2024-01-14T03:44:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jefferylovely/AthenaImaniMaven
Dataset automatically created during the evaluation run of model jefferylovely/AthenaImaniMaven on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
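For example, a minimal sketch (the repository id below follows the leaderboard's usual `details_<org>__<model>` naming and is an assumption, as are the choice of config and split; any configuration listed in the metadata above works the same way):

```python
from datasets import load_dataset

# Load one evaluation config; the "latest" split points to the most recent run.
data = load_dataset(
    "open-llm-leaderboard/details_jefferylovely__AthenaImaniMaven",  # assumed repo id
    "harness_winogrande_5",
    split="latest",
)
print(data)
```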
## Latest results
These are the latest results from run 2024-01-14T03:41:28.738425 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jefferylovely/AthenaImaniMaven\n\n\n\nDataset automatically created during the evaluation run of model jefferylovely/AthenaImaniMaven on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T03:41:28.738425(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jefferylovely/AthenaImaniMaven\n\n\n\nDataset automatically created during the evaluation run of model jefferylovely/AthenaImaniMaven on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T03:41:28.738425(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b8f3f70b1786faf553c058a987bddd31d3f992a6 |
# Dataset of felix_schultz/フィリックス・シュルツ/菲利克斯·舒尔茨 (Azur Lane)
This is the dataset of felix_schultz/フィリックス・シュルツ/菲利克斯·舒尔茨 (Azur Lane), containing 26 images and their tags.
The core tags of this character are `long_hair, purple_hair, red_eyes, twintails, breasts, very_long_hair, bangs, small_breasts, horns`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 26 | 58.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 26 | 26.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 63 | 54.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 26 | 48.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 63 | 87.56 MiB | [Download](https://huggingface.co/datasets/CyberHarem/felix_schultz_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
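If you only need one of the packed IMG+TXT variants, it can be downloaded and unpacked directly; a minimal sketch for the 800-pixel package (repo id and file name taken from the table above):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/felix_schultz_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)
# extract the paired image / tag-text files
target_dir = 'dataset_800'
os.makedirs(target_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(target_dir)
```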
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/felix_schultz_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 26 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | looking_at_viewer, 1girl, solo, bare_shoulders, elbow_gloves, navel, arms_up, open_mouth, revealing_clothes, armpits, black_gloves, blush, smile, thighs, black_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | 1girl | solo | bare_shoulders | elbow_gloves | navel | arms_up | open_mouth | revealing_clothes | armpits | black_gloves | blush | smile | thighs | black_dress |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:--------|:-------|:-----------------|:---------------|:--------|:----------|:-------------|:--------------------|:----------|:---------------|:--------|:--------|:---------|:--------------|
| 0 | 26 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/felix_schultz_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:25:15+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:32:10+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of felix\_schultz/フィリックス・シュルツ/菲利克斯·舒尔茨 (Azur Lane)
==========================================================
This is the dataset of felix\_schultz/フィリックス・シュルツ/菲利克斯·舒尔茨 (Azur Lane), containing 26 images and their tags.
The core tags of this character are 'long\_hair, purple\_hair, red\_eyes, twintails, breasts, very\_long\_hair, bangs, small\_breasts, horns', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ca7d5ae4638dcc3da858303901586f6f35d8f09c |
# Dataset of maryland/メリーランド/马里兰 (Azur Lane)
This is the dataset of maryland/メリーランド/马里兰 (Azur Lane), containing 16 images and their tags.
The core tags of this character are `long_hair, ponytail, red_eyes, red_hair, breasts, large_breasts, bangs, hair_between_eyes, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 16 | 15.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 16 | 9.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 18.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 16 | 14.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 25.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/maryland_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/maryland_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------|
| 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, smile, solo, black_gloves, dress, thighhighs, cleavage, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | solo | black_gloves | dress | thighhighs | cleavage | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------|:---------------|:--------|:-------------|:-----------|:--------------|
| 0 | 16 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X |
| CyberHarem/maryland_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:25:16+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:30:46+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of maryland/メリーランド/马里兰 (Azur Lane)
==========================================
This is the dataset of maryland/メリーランド/马里兰 (Azur Lane), containing 16 images and their tags.
The core tags of this character are 'long\_hair, ponytail, red\_eyes, red\_hair, breasts, large\_breasts, bangs, hair\_between\_eyes, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ebd5b848efed4a654558e79103eb0a42a6efa945 |
# Dataset of p22/P22/P22 (Girls' Frontline)
This is the dataset of p22/P22/P22 (Girls' Frontline), containing 25 images and their tags.
The core tags of this character are `blue_eyes, short_hair, bangs, breasts, hair_between_eyes, black_hair, earrings, grey_hair, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 25 | 25.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 25 | 16.85 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 49 | 29.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 25 | 23.06 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 49 | 39.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/p22_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/p22_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | looking_at_viewer, solo, 1girl, blue_jacket, cleavage, navel, black_shorts, black_thighhighs, blush, checkered_flag, fingerless_gloves, full_body, highleg_panties, race_queen, short_shorts, bikini, headset, high_heels, official_alternate_costume, sitting, thigh_boots, black_gloves, blue_panties, collarbone, cropped_jacket, holding_flag, open_clothes, smile |
| 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, solo, looking_at_viewer, jewelry, smile, bare_shoulders, jacket, closed_mouth, sleeveless, black_nails, handgun, holding_gun, long_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | looking_at_viewer | solo | 1girl | blue_jacket | cleavage | navel | black_shorts | black_thighhighs | blush | checkered_flag | fingerless_gloves | full_body | highleg_panties | race_queen | short_shorts | bikini | headset | high_heels | official_alternate_costume | sitting | thigh_boots | black_gloves | blue_panties | collarbone | cropped_jacket | holding_flag | open_clothes | smile | jewelry | bare_shoulders | jacket | closed_mouth | sleeveless | black_nails | handgun | holding_gun | long_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------|:-------|:--------|:--------------|:-----------|:--------|:---------------|:-------------------|:--------|:-----------------|:--------------------|:------------|:------------------|:-------------|:---------------|:---------|:----------|:-------------|:-----------------------------|:----------|:--------------|:---------------|:---------------|:-------------|:-----------------|:---------------|:---------------|:--------|:----------|:-----------------|:---------|:---------------|:-------------|:--------------|:----------|:--------------|:---------------|
| 0 | 7 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 18 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/p22_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:46:54+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:53:40+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of p22/P22/P22 (Girls' Frontline)
=========================================
This is the dataset of p22/P22/P22 (Girls' Frontline), containing 25 images and their tags.
The core tags of this character are 'blue\_eyes, short\_hair, bangs, breasts, hair\_between\_eyes, black\_hair, earrings, grey\_hair, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ab3167b14a5a1eb824f41b7e0be241c8c5f071f3 |
# Dataset of gr_mg23/GrMG23/HK23 (Girls' Frontline)
This is the dataset of gr_mg23/GrMG23/HK23 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `breasts, double_bun, hair_bun, long_hair, blonde_hair, large_breasts, purple_eyes, bangs, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 14.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 30 | 17.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 12.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 30 | 26.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_mg23_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gr_mg23_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, blush, gloves, looking_at_viewer, long_sleeves, open_mouth, white_background, black_skirt, black_thighhighs, pleated_skirt, black_jacket, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | blush | gloves | looking_at_viewer | long_sleeves | open_mouth | white_background | black_skirt | black_thighhighs | pleated_skirt | black_jacket | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------|:--------------------|:---------------|:-------------|:-------------------|:--------------|:-------------------|:----------------|:---------------|:--------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/gr_mg23_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T03:46:55+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T03:49:42+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of gr\_mg23/GrMG23/HK23 (Girls' Frontline)
==================================================
This is the dataset of gr\_mg23/GrMG23/HK23 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are 'breasts, double\_bun, hair\_bun, long\_hair, blonde\_hair, large\_breasts, purple\_eyes, bangs, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
12a9af32d6f23aa0497025920dbb79c46e97deaf |
# Dataset Card for Evaluation run of dfurman/HermesBagel-34B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dfurman/HermesBagel-34B-v0.1](https://huggingface.co/dfurman/HermesBagel-34B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1",
"harness_winogrande_5",
split="train")
```
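The aggregated metrics live in the "results" configuration described above; a minimal sketch (the `latest` split name mirrors the convention used by the other configurations and is an assumption here):

```python
from datasets import load_dataset

# The aggregated run metrics are stored in the "results" config;
# the "latest" split points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1",
    "results",
    split="latest",
)
print(results[0])
```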
## Latest results
These are the [latest results from run 2024-01-14T03:53:56.861170](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1/blob/main/results_2024-01-14T03-53-56.861170.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7695763625614322,
"acc_stderr": 0.02793431209028075,
"acc_norm": 0.7740465788313311,
"acc_norm_stderr": 0.028460203996252778,
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.673352473186811,
"mc2_stderr": 0.014617965588559495
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.013678810399518822,
"acc_norm": 0.7056313993174061,
"acc_norm_stderr": 0.01331852846053942
},
"harness|hellaswag|10": {
"acc": 0.6638119896434973,
"acc_stderr": 0.004714386376337134,
"acc_norm": 0.8573989245170285,
"acc_norm_stderr": 0.003489509493001622
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066653,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8947368421052632,
"acc_stderr": 0.024974533450920697,
"acc_norm": 0.8947368421052632,
"acc_norm_stderr": 0.024974533450920697
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036844,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036844
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.026355158413349414,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.026355158413349414
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924813,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924813
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6957671957671958,
"acc_stderr": 0.02369541500946309,
"acc_norm": 0.6957671957671958,
"acc_norm_stderr": 0.02369541500946309
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5634920634920635,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.5634920634920635,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9096774193548387,
"acc_stderr": 0.016306570644488313,
"acc_norm": 0.9096774193548387,
"acc_norm_stderr": 0.016306570644488313
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8606060606060606,
"acc_stderr": 0.027045948825865394,
"acc_norm": 0.8606060606060606,
"acc_norm_stderr": 0.027045948825865394
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9343434343434344,
"acc_stderr": 0.01764652667723332,
"acc_norm": 0.9343434343434344,
"acc_norm_stderr": 0.01764652667723332
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9689119170984456,
"acc_stderr": 0.012525310625527033,
"acc_norm": 0.9689119170984456,
"acc_norm_stderr": 0.012525310625527033
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.01889552448260495,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.01889552448260495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.030296771286067323,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.030296771286067323
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.02273020811930654,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.02273020811930654
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5298013245033113,
"acc_stderr": 0.040752249922169796,
"acc_norm": 0.5298013245033113,
"acc_norm_stderr": 0.040752249922169796
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9211009174311927,
"acc_stderr": 0.011558198113769584,
"acc_norm": 0.9211009174311927,
"acc_norm_stderr": 0.011558198113769584
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6620370370370371,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.6620370370370371,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8026905829596412,
"acc_stderr": 0.02670985334496796,
"acc_norm": 0.8026905829596412,
"acc_norm_stderr": 0.02670985334496796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.02622223517147737,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.02622223517147737
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9008264462809917,
"acc_stderr": 0.027285246312758957,
"acc_norm": 0.9008264462809917,
"acc_norm_stderr": 0.027285246312758957
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.030381596756651655,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.030381596756651655
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8773006134969326,
"acc_stderr": 0.025777328426978927,
"acc_norm": 0.8773006134969326,
"acc_norm_stderr": 0.025777328426978927
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8932038834951457,
"acc_stderr": 0.030581088928331366,
"acc_norm": 0.8932038834951457,
"acc_norm_stderr": 0.030581088928331366
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352202,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352202
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9054916985951469,
"acc_stderr": 0.010461015338193071,
"acc_norm": 0.9054916985951469,
"acc_norm_stderr": 0.010461015338193071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7977653631284917,
"acc_stderr": 0.013433729483320979,
"acc_norm": 0.7977653631284917,
"acc_norm_stderr": 0.013433729483320979
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8398692810457516,
"acc_stderr": 0.020998740930362303,
"acc_norm": 0.8398692810457516,
"acc_norm_stderr": 0.020998740930362303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.819935691318328,
"acc_stderr": 0.02182342285774494,
"acc_norm": 0.819935691318328,
"acc_norm_stderr": 0.02182342285774494
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8827160493827161,
"acc_stderr": 0.017903112615281123,
"acc_norm": 0.8827160493827161,
"acc_norm_stderr": 0.017903112615281123
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6560283687943262,
"acc_stderr": 0.02833801742861133,
"acc_norm": 0.6560283687943262,
"acc_norm_stderr": 0.02833801742861133
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6003911342894394,
"acc_stderr": 0.01251018163696068,
"acc_norm": 0.6003911342894394,
"acc_norm_stderr": 0.01251018163696068
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8419117647058824,
"acc_stderr": 0.022161462608068522,
"acc_norm": 0.8419117647058824,
"acc_norm_stderr": 0.022161462608068522
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8169934640522876,
"acc_stderr": 0.015643069911273344,
"acc_norm": 0.8169934640522876,
"acc_norm_stderr": 0.015643069911273344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8571428571428571,
"acc_stderr": 0.022401787435256396,
"acc_norm": 0.8571428571428571,
"acc_norm_stderr": 0.022401787435256396
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.02876234912646613,
"acc_norm": 0.91,
"acc_norm_stderr": 0.02876234912646613
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598053,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598053
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5006119951040392,
"mc1_stderr": 0.01750348793889251,
"mc2": 0.673352473186811,
"mc2_stderr": 0.014617965588559495
},
"harness|winogrande|5": {
"acc": 0.846093133385951,
"acc_stderr": 0.010141944523750028
},
"harness|gsm8k|5": {
"acc": 0.6527672479150872,
"acc_stderr": 0.013113898382146875
}
}
```
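The aggregated numbers above can also be retrieved programmatically from the `results` configuration of this dataset. A minimal sketch — the configuration name and the `latest` split come from this card's metadata, and the exact column layout of the stored record may vary between harness versions:

```python
from datasets import load_dataset

# Aggregated metrics for this run; the "results" config and "latest" split
# names are taken from this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1",
    "results",
    split="latest",
)

# Inspect the stored record; the column layout can differ between harness versions.
print(results.column_names)
print(results[0])
```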
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1 | [
"region:us"
] | 2024-01-14T03:56:07+00:00 | {"pretty_name": "Evaluation run of dfurman/HermesBagel-34B-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [dfurman/HermesBagel-34B-v0.1](https://huggingface.co/dfurman/HermesBagel-34B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T03:53:56.861170](https://huggingface.co/datasets/open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1/blob/main/results_2024-01-14T03-53-56.861170.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7695763625614322,\n \"acc_stderr\": 0.02793431209028075,\n \"acc_norm\": 0.7740465788313311,\n \"acc_norm_stderr\": 0.028460203996252778,\n \"mc1\": 0.5006119951040392,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.673352473186811,\n \"mc2_stderr\": 0.014617965588559495\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.013678810399518822,\n \"acc_norm\": 0.7056313993174061,\n \"acc_norm_stderr\": 0.01331852846053942\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6638119896434973,\n \"acc_stderr\": 0.004714386376337134,\n \"acc_norm\": 0.8573989245170285,\n \"acc_norm_stderr\": 0.003489509493001622\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066653,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8947368421052632,\n \"acc_stderr\": 0.024974533450920697,\n \"acc_norm\": 0.8947368421052632,\n \"acc_norm_stderr\": 0.024974533450920697\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036844,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036844\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n 
\"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.026355158413349414,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.026355158413349414\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924813,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924813\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.6957671957671958,\n \"acc_stderr\": 0.02369541500946309,\n \"acc_norm\": 0.6957671957671958,\n \"acc_norm_stderr\": 0.02369541500946309\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5634920634920635,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.5634920634920635,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9096774193548387,\n \"acc_stderr\": 0.016306570644488313,\n \"acc_norm\": 0.9096774193548387,\n \"acc_norm_stderr\": 0.016306570644488313\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8606060606060606,\n \"acc_stderr\": 0.027045948825865394,\n \"acc_norm\": 0.8606060606060606,\n \"acc_norm_stderr\": 0.027045948825865394\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9343434343434344,\n \"acc_stderr\": 0.01764652667723332,\n \"acc_norm\": 0.9343434343434344,\n \"acc_norm_stderr\": 0.01764652667723332\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9689119170984456,\n \"acc_stderr\": 0.012525310625527033,\n \"acc_norm\": 0.9689119170984456,\n \"acc_norm_stderr\": 0.012525310625527033\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.01889552448260495,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.01889552448260495\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.030296771286067323,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.030296771286067323\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.02273020811930654,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.02273020811930654\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.5298013245033113,\n \"acc_stderr\": 0.040752249922169796,\n \"acc_norm\": 0.5298013245033113,\n \"acc_norm_stderr\": 0.040752249922169796\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9211009174311927,\n \"acc_stderr\": 0.011558198113769584,\n \"acc_norm\": 0.9211009174311927,\n \"acc_norm_stderr\": 0.011558198113769584\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6620370370370371,\n \"acc_stderr\": 0.03225941352631295,\n \"acc_norm\": 0.6620370370370371,\n \"acc_norm_stderr\": 0.03225941352631295\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8026905829596412,\n \"acc_stderr\": 0.02670985334496796,\n \"acc_norm\": 0.8026905829596412,\n \"acc_norm_stderr\": 0.02670985334496796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.02622223517147737,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.02622223517147737\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.9008264462809917,\n \"acc_stderr\": 0.027285246312758957,\n \"acc_norm\": 0.9008264462809917,\n \"acc_norm_stderr\": 0.027285246312758957\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.030381596756651655,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.030381596756651655\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8773006134969326,\n \"acc_stderr\": 0.025777328426978927,\n \"acc_norm\": 0.8773006134969326,\n \"acc_norm_stderr\": 0.025777328426978927\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8932038834951457,\n \"acc_stderr\": 0.030581088928331366,\n \"acc_norm\": 0.8932038834951457,\n \"acc_norm_stderr\": 0.030581088928331366\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352202,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352202\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.9054916985951469,\n \"acc_stderr\": 0.010461015338193071,\n \"acc_norm\": 0.9054916985951469,\n \"acc_norm_stderr\": 0.010461015338193071\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7977653631284917,\n \"acc_stderr\": 0.013433729483320979,\n \"acc_norm\": 0.7977653631284917,\n \"acc_norm_stderr\": 0.013433729483320979\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8398692810457516,\n \"acc_stderr\": 0.020998740930362303,\n \"acc_norm\": 0.8398692810457516,\n \"acc_norm_stderr\": 0.020998740930362303\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.819935691318328,\n \"acc_stderr\": 0.02182342285774494,\n \"acc_norm\": 0.819935691318328,\n \"acc_norm_stderr\": 0.02182342285774494\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8827160493827161,\n \"acc_stderr\": 0.017903112615281123,\n \"acc_norm\": 0.8827160493827161,\n \"acc_norm_stderr\": 0.017903112615281123\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6560283687943262,\n \"acc_stderr\": 0.02833801742861133,\n \"acc_norm\": 0.6560283687943262,\n \"acc_norm_stderr\": 0.02833801742861133\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6003911342894394,\n \"acc_stderr\": 0.01251018163696068,\n \"acc_norm\": 0.6003911342894394,\n \"acc_norm_stderr\": 0.01251018163696068\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8419117647058824,\n \"acc_stderr\": 0.022161462608068522,\n \"acc_norm\": 0.8419117647058824,\n \"acc_norm_stderr\": 0.022161462608068522\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8169934640522876,\n \"acc_stderr\": 0.015643069911273344,\n \"acc_norm\": 0.8169934640522876,\n \"acc_norm_stderr\": 0.015643069911273344\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8571428571428571,\n \"acc_stderr\": 0.022401787435256396,\n \"acc_norm\": 0.8571428571428571,\n \"acc_norm_stderr\": 0.022401787435256396\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.02876234912646613,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.02876234912646613\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598053,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598053\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5006119951040392,\n \"mc1_stderr\": 0.01750348793889251,\n \"mc2\": 0.673352473186811,\n \"mc2_stderr\": 0.014617965588559495\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.846093133385951,\n \"acc_stderr\": 0.010141944523750028\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6527672479150872,\n \"acc_stderr\": 0.013113898382146875\n 
}\n}\n```", "repo_url": "https://huggingface.co/dfurman/HermesBagel-34B-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-53-56.861170.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-53-56.861170.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-53-56.861170.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T03-53-56.861170.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-53-56.861170.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T03_53_56.861170", "path": ["**/details_harness|winogrande|5_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T03-53-56.861170.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T03_53_56.861170", "path": ["results_2024-01-14T03-53-56.861170.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T03-53-56.861170.parquet"]}]}]} | 2024-01-14T03:56:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of dfurman/HermesBagel-34B-v0.1
Dataset automatically created during the evaluation run of model dfurman/HermesBagel-34B-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
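A minimal example, mirroring the snippet stored in this card's metadata — the `harness_winogrande_5` configuration is just one of the 63 available:

```python
from datasets import load_dataset

# Per-sample details for one task; any of the 63 configurations can be used here.
data = load_dataset(
    "open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1",
    "harness_winogrande_5",
    split="train",
)
```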
## Latest results
These are the latest results from run 2024-01-14T03:53:56.861170 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
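As a concrete sketch, the most recent per-sample details for a single eval can be pulled through its `latest` split (`harness_gsm8k_5` below is just one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# The "latest" split of every configuration points at the most recent run for that eval.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_dfurman__HermesBagel-34B-v0.1",
    "harness_gsm8k_5",
    split="latest",
)
print(len(gsm8k_details))
```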
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of dfurman/HermesBagel-34B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model dfurman/HermesBagel-34B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T03:53:56.861170(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of dfurman/HermesBagel-34B-v0.1\n\n\n\nDataset automatically created during the evaluation run of model dfurman/HermesBagel-34B-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T03:53:56.861170(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0dafc248bfd177e774c34ea45b9b70abb392f387 |
# AutoLamella Dataset
The autolamella dataset consists of images from multiple lamella preparation methods. All data is annotated for semantic segmentation and is available on the Hugging Face Hub at [patrickcleeve/autolamella](https://huggingface.co/datasets/patrickcleeve/autolamella).
### Summary
| Dataset / Method | Train | Test | Total |
| ----------- | ----------- | -----------| -----------|
| Waffle | 214 | 76 | 290 |
| Liftout | 801 | 163 | 969 |
| Serial Liftout | 301 | 109 | 412 |
| **Full** | **1316** | **348** | **1664** |
Details about the datasets can be found in summary.csv in the dataset directory.
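As a sketch (assuming `summary.csv` lives at the root of the dataset repository, which is an assumption about the exact path), it can be fetched and inspected with pandas:
```python
import pandas as pd
from huggingface_hub import hf_hub_download

# fetch summary.csv from the dataset repo (root-level path is an assumption)
csv_path = hf_hub_download(
    repo_id="patrickcleeve/autolamella",
    repo_type="dataset",
    filename="summary.csv",
)
summary = pd.read_csv(csv_path)
print(summary.head())
```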
### Labels
Currently, the dataset is labelled for the following classes. In the future, we will add additional labels for objects such as ice contamination. If you would like to label this data, please see the labelling tools to get started.
```yaml
CLASS_LABELS: # autolamella
0: "background"
1: "lamella"
2: "manipulator"
3: "landing_post"
4: "copper_adaptor"
5: "volume_block"
```
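The integer values in each annotation mask correspond to these class ids (as the `decode_segmap_v2` call in the display example below suggests). A small illustrative sketch, assuming the masks store class indices directly, counts the pixels per class for one annotation:
```python
import numpy as np
from datasets import load_dataset

# mirrors the CLASS_LABELS mapping above
CLASS_LABELS = {
    0: "background", 1: "lamella", 2: "manipulator",
    3: "landing_post", 4: "copper_adaptor", 5: "volume_block",
}

ds = load_dataset("patrickcleeve/autolamella", name="waffle", split="test")
mask = np.asarray(ds[0]["annotation"])

# count pixels belonging to each class in this annotation
values, counts = np.unique(mask, return_counts=True)
for value, count in zip(values, counts):
    print(f"{CLASS_LABELS.get(int(value), 'unknown')}: {count} px")
```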
## Download Datasets
To download the datasets, you can use the Hugging Face `datasets` API:
```python
from datasets import load_dataset
# download waffle dataset
ds = load_dataset("patrickcleeve/autolamella", name="waffle")
# download liftout dataset
ds = load_dataset("patrickcleeve/autolamella", name="liftout")
# download serial-liftout dataset
ds = load_dataset("patrickcleeve/autolamella", name="serial-liftout")
# download test split only
ds = load_dataset("patrickcleeve/autolamella", name="waffle", split="test")
```
To display images and annotations:
```python
# show a random image and annotation (training split)
import random
import numpy as np
import matplotlib.pyplot as plt
from fibsem.segmentation.utils import decode_segmap_v2
# pick a random training example (randint is inclusive on both ends)
idx = random.randint(0, len(ds["train"]) - 1)
image = np.asarray(ds["train"][idx]["image"])
mask = np.asarray(ds["train"][idx]["annotation"])
# metadata
split = ds["train"].split
config_name = ds["train"].config_name
plt.title(f"{config_name}-{split}-{idx:02d}")
plt.imshow(image, cmap="gray", alpha=0.7)
plt.imshow(decode_segmap_v2(mask), alpha=0.3)
plt.axis("off")
plt.show()
```
| Waffle | Liftout | Serial Liftout |
| ----------- | ----------- | ----------- |
| ![WaffleData](assets/show_waffle.png) | ![LiftoutData](assets/show_liftout.png) | ![LiftoutData](assets/show_serial_liftout.png) |
You can also concatenate the datasets into a single dataset for combined training (e.g. mega models):
```python
from datasets import load_dataset, concatenate_datasets
# load individual datasets
waffle_train_ds = load_dataset("patrickcleeve/autolamella", name="waffle", split="train")
liftout_train_ds = load_dataset("patrickcleeve/autolamella", name="liftout", split="train")
serial_liftout_train_ds = load_dataset("patrickcleeve/autolamella", name="serial-liftout", split="train")
# concatenate datasets (e.g. mega model)
train_ds = concatenate_datasets([waffle_train_ds, liftout_train_ds, serial_liftout_train_ds])
print(train_ds)
```
```yaml
Dataset({
features: ['image', 'annotation'],
num_rows: 1316
})
```
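From here, a typical next step is to feed the combined dataset to a training loop. A minimal sketch (PyTorch is an assumption here, not a requirement of the dataset) converting a few examples to tensors:
```python
import numpy as np
import torch

# iterate over a few examples from the combined dataset and convert them to tensors
for example in train_ds.select(range(4)):
    image = torch.from_numpy(np.array(example["image"])).float()
    mask = torch.from_numpy(np.array(example["annotation"])).long()
    print(image.shape, mask.shape)
```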
### Acknowledgement
- Waffle and Liftout data from Monash
- Serial Liftout data from MPI
| patrickcleeve/autolamella | [
"license:mit",
"region:us"
] | 2024-01-14T03:59:29+00:00 | {"license": "mit", "dataset_info": [{"config_name": "liftout", "features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 2479679335.0, "num_examples": 801}, {"name": "test", "num_bytes": 514295427.0, "num_examples": 163}], "download_size": 1540632118, "dataset_size": 2993974762.0}, {"config_name": "serial-liftout", "features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 946980390.0, "num_examples": 301}, {"name": "test", "num_bytes": 342926454.0, "num_examples": 109}], "download_size": 457168711, "dataset_size": 1289906844.0}, {"config_name": "waffle", "features": [{"name": "image", "dtype": "image"}, {"name": "annotation", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 673435138.0, "num_examples": 214}, {"name": "test", "num_bytes": 239208412.0, "num_examples": 76}], "download_size": 477754123, "dataset_size": 912643550.0}], "configs": [{"config_name": "liftout", "data_files": [{"split": "train", "path": "liftout/train-*"}, {"split": "test", "path": "liftout/test-*"}]}, {"config_name": "serial-liftout", "data_files": [{"split": "train", "path": "serial-liftout/train-*"}, {"split": "test", "path": "serial-liftout/test-*"}]}, {"config_name": "waffle", "data_files": [{"split": "train", "path": "waffle/train-*"}, {"split": "test", "path": "waffle/test-*"}]}]} | 2024-01-21T10:49:41+00:00 | [] | [] | TAGS
#license-mit #region-us
| AutoLamella Dataset
===================
The autolamella dataset consists of images from multiple different lamella preparation methods. All data is annotated for semantic segmentation, and is available through the huggingface api at patrickcleeve/autolamella
Summary
Details about the datasets can be found in URL in the dataset directory.
### Labels
Currently, the dataset is labelled for the following classes. In the future, we will add additional labels for objects such as ice contamination. If you would like to label this data, please see the labelling tools to get started.
Download Datasets
-----------------
To download datasets, you can use the huggingface api:
To display images and annotations:
Waffle: !WaffleData, Liftout: !LiftoutData, Serial Liftout: !LiftoutData
You can also concatenate the datasets together into a single dataset for easy combined training (e.g. mega models)
### Acknowledgement
* Waffle and Liftout data from Monash
* Serial Liftout data from MPI
| [
"### Labels\n\n\nCurrently, the dataset is labelled for the following classes. In the future, we will add additional labels for objects such as ice contamination. If you would like to label this data, please see the labelling tools to get started.\n\n\nDownload Datasets\n-----------------\n\n\nTo download datasets, you can use the huggingface api:\n\n\nTo display images and annotations:\n\n\nWaffle: !WaffleData, Liftout: !LiftoutData, Serial Liftout: !LiftoutData\n\n\nYou can also concatenate the datasets together into a single dataset for easy combined training (e.g. mega models)",
"### Acknowledgement\n\n\n* Waffle and Liftout data from Monash\n* Serial Liftout data from MPI"
] | [
"TAGS\n#license-mit #region-us \n",
"### Labels\n\n\nCurrently, the dataset is labelled for the following classes. In the future, we will add additional labels for objects such as ice contamination. If you would like to label this data, please see the labelling tools to get started.\n\n\nDownload Datasets\n-----------------\n\n\nTo download datasets, you can use the huggingface api:\n\n\nTo display images and annotations:\n\n\nWaffle: !WaffleData, Liftout: !LiftoutData, Serial Liftout: !LiftoutData\n\n\nYou can also concatenate the datasets together into a single dataset for easy combined training (e.g. mega models)",
"### Acknowledgement\n\n\n* Waffle and Liftout data from Monash\n* Serial Liftout data from MPI"
] |
ef8e2a527e2f3cd53a649c98858122bae2f428ed |
# Dataset of cx4_storm/Cx4ストーム/Cx4风暴 (Girls' Frontline)
This is the dataset of cx4_storm/Cx4ストーム/Cx4风暴 (Girls' Frontline), containing 15 images and their tags.
The core tags of this character are `black_hair, long_hair, bow, breasts, red_eyes, hair_bow, red_bow, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 15 | 15.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 15 | 9.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 33 | 18.42 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 15 | 13.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 33 | 24.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cx4_storm_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cx4_storm_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, looking_at_viewer, blush, navel, open_mouth, simple_background, black_panties, black_thighhighs, garter_straps, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | looking_at_viewer | blush | navel | open_mouth | simple_background | black_panties | black_thighhighs | garter_straps | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------------|:--------|:--------|:-------------|:--------------------|:----------------|:-------------------|:----------------|:-------------------|
| 0 | 15 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/cx4_storm_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T04:23:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T04:26:37+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of cx4\_storm/Cx4ストーム/Cx4风暴 (Girls' Frontline)
======================================================
This is the dataset of cx4\_storm/Cx4ストーム/Cx4风暴 (Girls' Frontline), containing 15 images and their tags.
The core tags of this character are 'black\_hair, long\_hair, bow, breasts, red\_eyes, hair\_bow, red\_bow, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
ee2c899c5eae52ae85cb929ad5a08c407956c584 |
# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-2x34B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-2x34B](https://huggingface.co/Weyaxi/Bagel-Hermes-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B",
"harness_winogrande_5",
split="train")
```
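The aggregated "results" configuration described above can be loaded the same way; a sketch, assuming it follows the same config/split naming conventions as the per-task configurations:
```python
from datasets import load_dataset

# "results" config name and "latest" split follow the conventions described above (assumption)
results_config = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B",
	"results",
	split="latest")
```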
## Latest results
These are the [latest results from run 2024-01-14T04:24:57.713282](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B/blob/main/results_2024-01-14T04-24-57.713282.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7687937231792787,
"acc_stderr": 0.027887592122908762,
"acc_norm": 0.7725082714288936,
"acc_norm_stderr": 0.028420468097469523,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6482085164957936,
"mc2_stderr": 0.01484519519589757
},
"harness|arc:challenge|25": {
"acc": 0.6749146757679181,
"acc_stderr": 0.013688147309729119,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.013417519144716417
},
"harness|hellaswag|10": {
"acc": 0.6595299741087433,
"acc_stderr": 0.004728988167338544,
"acc_norm": 0.8526190001991635,
"acc_norm_stderr": 0.0035376085010691773
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.03785714465066652,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.03785714465066652
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.02427022773752271,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.02427022773752271
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8037735849056604,
"acc_stderr": 0.024442388131100806,
"acc_norm": 0.8037735849056604,
"acc_norm_stderr": 0.024442388131100806
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7957446808510639,
"acc_stderr": 0.02635515841334941,
"acc_norm": 0.7957446808510639,
"acc_norm_stderr": 0.02635515841334941
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7793103448275862,
"acc_stderr": 0.03455930201924813,
"acc_norm": 0.7793103448275862,
"acc_norm_stderr": 0.03455930201924813
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02351729433596328,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02351729433596328
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5793650793650794,
"acc_stderr": 0.04415438226743745,
"acc_norm": 0.5793650793650794,
"acc_norm_stderr": 0.04415438226743745
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9032258064516129,
"acc_stderr": 0.016818943416345197,
"acc_norm": 0.9032258064516129,
"acc_norm_stderr": 0.016818943416345197
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6403940886699507,
"acc_stderr": 0.03376458246509567,
"acc_norm": 0.6403940886699507,
"acc_norm_stderr": 0.03376458246509567
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.025485498373343237,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.025485498373343237
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.01699999492742161,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.01699999492742161
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9740932642487047,
"acc_stderr": 0.01146452335695318,
"acc_norm": 0.9740932642487047,
"acc_norm_stderr": 0.01146452335695318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8179487179487179,
"acc_stderr": 0.019565236782930887,
"acc_norm": 0.8179487179487179,
"acc_norm_stderr": 0.019565236782930887
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4185185185185185,
"acc_stderr": 0.030078013075022055,
"acc_norm": 0.4185185185185185,
"acc_norm_stderr": 0.030078013075022055
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8613445378151261,
"acc_stderr": 0.02244826447683259,
"acc_norm": 0.8613445378151261,
"acc_norm_stderr": 0.02244826447683259
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9100917431192661,
"acc_stderr": 0.012264304540230446,
"acc_norm": 0.9100917431192661,
"acc_norm_stderr": 0.012264304540230446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6898148148148148,
"acc_stderr": 0.03154696285656629,
"acc_norm": 0.6898148148148148,
"acc_norm_stderr": 0.03154696285656629
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9313725490196079,
"acc_stderr": 0.017744453647073322,
"acc_norm": 0.9313725490196079,
"acc_norm_stderr": 0.017744453647073322
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486884,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486884
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280226,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280226
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8925619834710744,
"acc_stderr": 0.028268812192540627,
"acc_norm": 0.8925619834710744,
"acc_norm_stderr": 0.028268812192540627
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.02684576505455385,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.02684576505455385
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6071428571428571,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.6071428571428571,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.9029126213592233,
"acc_stderr": 0.02931596291881348,
"acc_norm": 0.9029126213592233,
"acc_norm_stderr": 0.02931596291881348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9273504273504274,
"acc_stderr": 0.01700436856813234,
"acc_norm": 0.9273504273504274,
"acc_norm_stderr": 0.01700436856813234
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.909323116219668,
"acc_stderr": 0.010268429662528547,
"acc_norm": 0.909323116219668,
"acc_norm_stderr": 0.010268429662528547
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8265895953757225,
"acc_stderr": 0.020383229551135033,
"acc_norm": 0.8265895953757225,
"acc_norm_stderr": 0.020383229551135033
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7899441340782123,
"acc_stderr": 0.013623755371333533,
"acc_norm": 0.7899441340782123,
"acc_norm_stderr": 0.013623755371333533
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.01970403918385981,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.01970403918385981
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8327974276527331,
"acc_stderr": 0.021193872528034962,
"acc_norm": 0.8327974276527331,
"acc_norm_stderr": 0.021193872528034962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8919753086419753,
"acc_stderr": 0.01727176308448352,
"acc_norm": 0.8919753086419753,
"acc_norm_stderr": 0.01727176308448352
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6595744680851063,
"acc_stderr": 0.028267657482650158,
"acc_norm": 0.6595744680851063,
"acc_norm_stderr": 0.028267657482650158
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.6069100391134289,
"acc_stderr": 0.012474899613873955,
"acc_norm": 0.6069100391134289,
"acc_norm_stderr": 0.012474899613873955
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8455882352941176,
"acc_stderr": 0.021950024722922033,
"acc_norm": 0.8455882352941176,
"acc_norm_stderr": 0.021950024722922033
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.01553374508338279,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.01553374508338279
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8489795918367347,
"acc_stderr": 0.022923004094736847,
"acc_norm": 0.8489795918367347,
"acc_norm_stderr": 0.022923004094736847
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659393,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659393
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136616,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136616
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961574,
"mc2": 0.6482085164957936,
"mc2_stderr": 0.01484519519589757
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065609
},
"harness|gsm8k|5": {
"acc": 0.6868840030326004,
"acc_stderr": 0.012774285669385096
}
}
```
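As an illustration, the MMLU-style average can be recomputed from the per-task entries above; a small sketch, where `latest_results` is assumed to hold the dictionary printed above:
```python
# `latest_results` is assumed to be the dictionary shown above
mmlu_accs = [v["acc"] for k, v in latest_results.items() if k.startswith("harness|hendrycksTest-")]
print(f"{len(mmlu_accs)} MMLU tasks, mean acc: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```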
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34b | [
"region:us"
] | 2024-01-14T04:27:11+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Bagel-Hermes-2x34B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Bagel-Hermes-2x34B](https://huggingface.co/Weyaxi/Bagel-Hermes-2x34B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T04:24:57.713282](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B/blob/main/results_2024-01-14T04-24-57.713282.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7687937231792787,\n \"acc_stderr\": 0.027887592122908762,\n \"acc_norm\": 0.7725082714288936,\n \"acc_norm_stderr\": 0.028420468097469523,\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6482085164957936,\n \"mc2_stderr\": 0.01484519519589757\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6749146757679181,\n \"acc_stderr\": 0.013688147309729119,\n \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716417\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6595299741087433,\n \"acc_stderr\": 0.004728988167338544,\n \"acc_norm\": 0.8526190001991635,\n \"acc_norm_stderr\": 0.0035376085010691773\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.03785714465066652,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.03785714465066652\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.02427022773752271,\n \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.02427022773752271\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8037735849056604,\n \"acc_stderr\": 0.024442388131100806,\n \"acc_norm\": 0.8037735849056604,\n \"acc_norm_stderr\": 0.024442388131100806\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 
0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7957446808510639,\n \"acc_stderr\": 0.02635515841334941,\n \"acc_norm\": 0.7957446808510639,\n \"acc_norm_stderr\": 0.02635515841334941\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7793103448275862,\n \"acc_stderr\": 0.03455930201924813,\n \"acc_norm\": 0.7793103448275862,\n \"acc_norm_stderr\": 0.03455930201924813\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02351729433596328,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02351729433596328\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5793650793650794,\n \"acc_stderr\": 0.04415438226743745,\n \"acc_norm\": 0.5793650793650794,\n \"acc_norm_stderr\": 0.04415438226743745\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9032258064516129,\n \"acc_stderr\": 0.016818943416345197,\n \"acc_norm\": 0.9032258064516129,\n \"acc_norm_stderr\": 0.016818943416345197\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6403940886699507,\n \"acc_stderr\": 0.03376458246509567,\n \"acc_norm\": 0.6403940886699507,\n \"acc_norm_stderr\": 0.03376458246509567\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.025485498373343237,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.025485498373343237\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.01699999492742161,\n \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.01699999492742161\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9740932642487047,\n \"acc_stderr\": 0.01146452335695318,\n \"acc_norm\": 0.9740932642487047,\n \"acc_norm_stderr\": 0.01146452335695318\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.8179487179487179,\n \"acc_stderr\": 0.019565236782930887,\n \"acc_norm\": 0.8179487179487179,\n \"acc_norm_stderr\": 0.019565236782930887\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.4185185185185185,\n \"acc_stderr\": 0.030078013075022055,\n \"acc_norm\": 0.4185185185185185,\n \"acc_norm_stderr\": 0.030078013075022055\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8613445378151261,\n \"acc_stderr\": 0.02244826447683259,\n \"acc_norm\": 0.8613445378151261,\n \"acc_norm_stderr\": 0.02244826447683259\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9100917431192661,\n \"acc_stderr\": 0.012264304540230446,\n \"acc_norm\": 0.9100917431192661,\n \"acc_norm_stderr\": 0.012264304540230446\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6898148148148148,\n \"acc_stderr\": 0.03154696285656629,\n \"acc_norm\": 0.6898148148148148,\n \"acc_norm_stderr\": 0.03154696285656629\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9313725490196079,\n \"acc_stderr\": 0.017744453647073322,\n \"acc_norm\": 0.9313725490196079,\n \"acc_norm_stderr\": 0.017744453647073322\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486884,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486884\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280226,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280226\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8925619834710744,\n \"acc_stderr\": 0.028268812192540627,\n \"acc_norm\": 0.8925619834710744,\n \"acc_norm_stderr\": 0.028268812192540627\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.02684576505455385,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.02684576505455385\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6071428571428571,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.6071428571428571,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.9029126213592233,\n \"acc_stderr\": 0.02931596291881348,\n \"acc_norm\": 0.9029126213592233,\n \"acc_norm_stderr\": 0.02931596291881348\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9273504273504274,\n \"acc_stderr\": 0.01700436856813234,\n \"acc_norm\": 0.9273504273504274,\n \"acc_norm_stderr\": 0.01700436856813234\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.909323116219668,\n \"acc_stderr\": 0.010268429662528547,\n 
\"acc_norm\": 0.909323116219668,\n \"acc_norm_stderr\": 0.010268429662528547\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8265895953757225,\n \"acc_stderr\": 0.020383229551135033,\n \"acc_norm\": 0.8265895953757225,\n \"acc_norm_stderr\": 0.020383229551135033\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7899441340782123,\n \"acc_stderr\": 0.013623755371333533,\n \"acc_norm\": 0.7899441340782123,\n \"acc_norm_stderr\": 0.013623755371333533\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.01970403918385981,\n \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.01970403918385981\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8327974276527331,\n \"acc_stderr\": 0.021193872528034962,\n \"acc_norm\": 0.8327974276527331,\n \"acc_norm_stderr\": 0.021193872528034962\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8919753086419753,\n \"acc_stderr\": 0.01727176308448352,\n \"acc_norm\": 0.8919753086419753,\n \"acc_norm_stderr\": 0.01727176308448352\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6595744680851063,\n \"acc_stderr\": 0.028267657482650158,\n \"acc_norm\": 0.6595744680851063,\n \"acc_norm_stderr\": 0.028267657482650158\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.6069100391134289,\n \"acc_stderr\": 0.012474899613873955,\n \"acc_norm\": 0.6069100391134289,\n \"acc_norm_stderr\": 0.012474899613873955\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8455882352941176,\n \"acc_stderr\": 0.021950024722922033,\n \"acc_norm\": 0.8455882352941176,\n \"acc_norm_stderr\": 0.021950024722922033\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.01553374508338279,\n \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.01553374508338279\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8489795918367347,\n \"acc_stderr\": 0.022923004094736847,\n \"acc_norm\": 0.8489795918367347,\n \"acc_norm_stderr\": 0.022923004094736847\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n \"acc_stderr\": 0.021166216304659393,\n \"acc_norm\": 0.900497512437811,\n \"acc_norm_stderr\": 0.021166216304659393\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n \"mc1_stderr\": 0.017483547156961574,\n \"mc2\": 0.6482085164957936,\n \"mc2_stderr\": 0.01484519519589757\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065609\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6868840030326004,\n \"acc_stderr\": 0.012774285669385096\n }\n}\n```", "repo_url": 
"https://huggingface.co/Weyaxi/Bagel-Hermes-2x34B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|arc:challenge|25_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|gsm8k|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hellaswag|10_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-24-57.713282.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-24-57.713282.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-24-57.713282.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T04-24-57.713282.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-24-57.713282.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-24-57.713282.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["**/details_harness|winogrande|5_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T04-24-57.713282.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T04_24_57.713282", "path": ["results_2024-01-14T04-24-57.713282.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T04-24-57.713282.parquet"]}]}]} | 2024-01-25T08:33:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-2x34B
Dataset automatically created during the evaluation run of model Weyaxi/Bagel-Hermes-2x34B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
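A minimal sketch of such a call, assuming the dataset id follows the leaderboard naming pattern used elsewhere in this dump (`open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B` is an assumption here) and reusing the `harness_winogrande_5` configuration listed in the metadata above:
```python
from datasets import load_dataset

# the repo id below is assumed from the leaderboard naming convention;
# config and split names follow the listing in this card's metadata
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Bagel-Hermes-2x34B",
                    "harness_winogrande_5",
                    split="train")
```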
## Latest results
These are the latest results from run 2024-01-14T04:24:57.713282 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-2x34B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Bagel-Hermes-2x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T04:24:57.713282(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Bagel-Hermes-2x34B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Bagel-Hermes-2x34B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T04:24:57.713282(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
78a9d79266e62e1fa16775b7d6d32b09b26be756 |
# Dataset of pzb39/PzB39/PzB39 (Girls' Frontline)
This is the dataset of pzb39/PzB39/PzB39 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `black_hair, breasts, long_hair, bangs, red_eyes, very_long_hair, hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 12.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 7.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 19 | 12.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 10.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 19 | 18.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pzb39_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pzb39_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, gloves, holding, smile, simple_background, white_background, arm_tattoo, bare_shoulders, black_jacket, black_necktie, black_pants, closed_mouth, ground_vehicle, gun, headwear_removed, long_sleeves, motorcycle, red_choker, uniform |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | gloves | holding | smile | simple_background | white_background | arm_tattoo | bare_shoulders | black_jacket | black_necktie | black_pants | closed_mouth | ground_vehicle | gun | headwear_removed | long_sleeves | motorcycle | red_choker | uniform |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------|:----------|:--------|:--------------------|:-------------------|:-------------|:-----------------|:---------------|:----------------|:--------------|:---------------|:-----------------|:------|:-------------------|:---------------|:-------------|:-------------|:----------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
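Since the clusters above are just aggregations of each image's `tags` metadata, a particular outfit can be pulled out by filtering on those tags while iterating. A small sketch building on the loading snippet above (the `solo` and `motorcycle` tags are taken from cluster 0; `dataset_dir` is the directory extracted there):
```python
from waifuc.source import LocalSource

# reuse the directory extracted in the loading example above
source = LocalSource('dataset_dir')

# keep only images whose tag metadata contains every wanted tag
wanted = {'solo', 'motorcycle'}
for item in source:
    tags = set(item.meta['tags'])  # works whether tags is a list or a tag->score mapping
    if wanted.issubset(tags):
        print(item.meta['filename'], 'matches', sorted(wanted))
```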
| CyberHarem/pzb39_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T04:46:44+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T04:49:46+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of pzb39/PzB39/PzB39 (Girls' Frontline)
===============================================
This is the dataset of pzb39/PzB39/PzB39 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are 'black\_hair, breasts, long\_hair, bangs, red\_eyes, very\_long\_hair, hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
785109254fdc40dd7e1ee0687417568ee043fa61 |
# Dataset of gr_psg_1/GrPSG-1/PSG-1 (Girls' Frontline)
This is the dataset of gr_psg_1/GrPSG-1/PSG-1 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are `breasts, ponytail, long_hair, hair_ornament, grey_eyes, white_hair, bangs, medium_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 8.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 6.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 12.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 8.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 14.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/gr_psg_1_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/gr_psg_1_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
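The pre-packaged IMG+TXT variants from the List of Packages table can also be used without waifuc: the package type suggests each image ships with a sidecar `.txt` caption holding its tags. A rough sketch, assuming that layout and the `dataset-800.zip` package listed above:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the pre-packaged IMG+TXT variants (800px here)
zip_file = hf_hub_download(
    repo_id='CyberHarem/gr_psg_1_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

pack_dir = 'gr_psg_1_800'
os.makedirs(pack_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(pack_dir)

# print each caption file; the exact folder layout inside the zip is an assumption
for root, _, files in os.walk(pack_dir):
    for name in sorted(files):
        if name.endswith('.txt'):
            with open(os.path.join(root, name), encoding='utf-8') as f:
                print(name, '->', f.read().strip())
```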
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, jacket, sniper_rifle, bikini_top_only, full_body, black_pantyhose, closed_mouth, front-tie_top, black_bikini, black_footwear, cleavage, collarbone, navel, open_clothes, scope, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | jacket | sniper_rifle | bikini_top_only | full_body | black_pantyhose | closed_mouth | front-tie_top | black_bikini | black_footwear | cleavage | collarbone | navel | open_clothes | scope | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------|:---------------|:------------------|:------------|:------------------|:---------------|:----------------|:---------------|:-----------------|:-----------|:-------------|:--------|:---------------|:--------|:--------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/gr_psg_1_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T04:46:46+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T04:49:19+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of gr\_psg\_1/GrPSG-1/PSG-1 (Girls' Frontline)
======================================================
This is the dataset of gr\_psg\_1/GrPSG-1/PSG-1 (Girls' Frontline), containing 12 images and their tags.
The core tags of this character are 'breasts, ponytail, long\_hair, hair\_ornament, grey\_eyes, white\_hair, bangs, medium\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
8a7be6a2da9fd5e94aecf35398ef269f7ccf891e |
# Dataset Card for Evaluation run of harborwater/dpo-test-hermes-open-llama-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [harborwater/dpo-test-hermes-open-llama-3b](https://huggingface.co/harborwater/dpo-test-hermes-open-llama-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b",
"harness_winogrande_5",
split="train")
```
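The aggregated metrics shown under "Latest results" below are stored in the "results" configuration described above, and it can be loaded the same way. A sketch following the same pattern (the `results` config name and `latest` split follow the config listings shown earlier in this dump; both are assumptions for this specific repo):
```python
from datasets import load_dataset

# aggregated run-level metrics; "latest" mirrors the newest timestamped split
results = load_dataset("open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b",
                       "results",
                       split="latest")
print(results.column_names)
```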
## Latest results
These are the [latest results from run 2024-01-14T04:56:07.071188](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b/blob/main/results_2024-01-14T04-56-07.071188.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2514093021467422,
"acc_stderr": 0.03052650097964464,
"acc_norm": 0.25202173312622367,
"acc_norm_stderr": 0.03127688845727799,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520672,
"mc2": 0.3980562710501165,
"mc2_stderr": 0.014269053798319005
},
"harness|arc:challenge|25": {
"acc": 0.36689419795221845,
"acc_stderr": 0.014084133118104292,
"acc_norm": 0.3924914675767918,
"acc_norm_stderr": 0.014269634635670712
},
"harness|hellaswag|10": {
"acc": 0.5091615216092412,
"acc_stderr": 0.004988943721711217,
"acc_norm": 0.6745668193586934,
"acc_norm_stderr": 0.004675789156977649
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.03547854198560824,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.03547854198560824
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123415,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123415
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.02575755989310675,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.02575755989310675
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641145,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641145
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.020940481565334866,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.020940481565334866
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18387096774193548,
"acc_stderr": 0.022037217340267833,
"acc_norm": 0.18387096774193548,
"acc_norm_stderr": 0.022037217340267833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18719211822660098,
"acc_stderr": 0.027444924966882618,
"acc_norm": 0.18719211822660098,
"acc_norm_stderr": 0.027444924966882618
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206824,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206824
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538783,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538783
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817234,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2282051282051282,
"acc_stderr": 0.02127839386358628,
"acc_norm": 0.2282051282051282,
"acc_norm_stderr": 0.02127839386358628
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21467889908256882,
"acc_stderr": 0.017604304149256494,
"acc_norm": 0.21467889908256882,
"acc_norm_stderr": 0.017604304149256494
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.02649191472735516,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.02649191472735516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.043300437496507437,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.043300437496507437
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455766,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455766
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22905027932960895,
"acc_stderr": 0.01405431493561456,
"acc_norm": 0.22905027932960895,
"acc_norm_stderr": 0.01405431493561456
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.023152722439402303,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.023152722439402303
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24382716049382716,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.24382716049382716,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.02512373922687241,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.02512373922687241
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.01100597139992724,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.01100597139992724
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20955882352941177,
"acc_stderr": 0.02472311040767705,
"acc_norm": 0.20955882352941177,
"acc_norm_stderr": 0.02472311040767705
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528037,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528037
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2,
"acc_stderr": 0.025607375986579153,
"acc_norm": 0.2,
"acc_norm_stderr": 0.025607375986579153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520672,
"mc2": 0.3980562710501165,
"mc2_stderr": 0.014269053798319005
},
"harness|winogrande|5": {
"acc": 0.6440410418310971,
"acc_stderr": 0.01345674065627396
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.003195747075480815
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b | [
"region:us"
] | 2024-01-14T04:57:52+00:00 | {"pretty_name": "Evaluation run of harborwater/dpo-test-hermes-open-llama-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [harborwater/dpo-test-hermes-open-llama-3b](https://huggingface.co/harborwater/dpo-test-hermes-open-llama-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T04:56:07.071188](https://huggingface.co/datasets/open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b/blob/main/results_2024-01-14T04-56-07.071188.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2514093021467422,\n \"acc_stderr\": 0.03052650097964464,\n \"acc_norm\": 0.25202173312622367,\n \"acc_norm_stderr\": 0.03127688845727799,\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.3980562710501165,\n \"mc2_stderr\": 0.014269053798319005\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.36689419795221845,\n \"acc_stderr\": 0.014084133118104292,\n \"acc_norm\": 0.3924914675767918,\n \"acc_norm_stderr\": 0.014269634635670712\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5091615216092412,\n \"acc_stderr\": 0.004988943721711217,\n \"acc_norm\": 0.6745668193586934,\n \"acc_norm_stderr\": 0.004675789156977649\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n \"acc_stderr\": 0.03547854198560824,\n \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.03547854198560824\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123415,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123415\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.02575755989310675,\n \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.02575755989310675\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n \"acc_stderr\": 0.03186209851641145,\n \"acc_norm\": 0.2254335260115607,\n \"acc_norm_stderr\": 0.03186209851641145\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.020940481565334866,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.020940481565334866\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18387096774193548,\n \"acc_stderr\": 0.022037217340267833,\n \"acc_norm\": 0.18387096774193548,\n \"acc_norm_stderr\": 0.022037217340267833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206824,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206824\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538783,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538783\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817234,\n \"acc_norm\": 
0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817234\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2282051282051282,\n \"acc_stderr\": 0.02127839386358628,\n \"acc_norm\": 0.2282051282051282,\n \"acc_norm_stderr\": 0.02127839386358628\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21467889908256882,\n \"acc_stderr\": 0.017604304149256494,\n \"acc_norm\": 0.21467889908256882,\n \"acc_norm_stderr\": 0.017604304149256494\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.02649191472735516,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.02649191472735516\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.043300437496507437,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.043300437496507437\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n \"acc_stderr\": 0.015594955384455766,\n \"acc_norm\": 0.2554278416347382,\n \"acc_norm_stderr\": 0.015594955384455766\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22905027932960895,\n \"acc_stderr\": 0.01405431493561456,\n \"acc_norm\": 0.22905027932960895,\n \"acc_norm_stderr\": 0.01405431493561456\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.023152722439402303,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.023152722439402303\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.24758842443729903,\n \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.24382716049382716,\n \"acc_stderr\": 0.023891879541959614,\n \"acc_norm\": 0.24382716049382716,\n \"acc_norm_stderr\": 0.023891879541959614\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.02512373922687241,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.02512373922687241\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n \"acc_stderr\": 0.01100597139992724,\n \"acc_norm\": 0.24641460234680573,\n \"acc_norm_stderr\": 0.01100597139992724\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20955882352941177,\n \"acc_stderr\": 0.02472311040767705,\n \"acc_norm\": 0.20955882352941177,\n \"acc_norm_stderr\": 0.02472311040767705\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528037,\n \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528037\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.025607375986579153,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.025607375986579153\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520672,\n \"mc2\": 0.3980562710501165,\n \"mc2_stderr\": 0.014269053798319005\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6440410418310971,\n \"acc_stderr\": 0.01345674065627396\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.013646702047005308,\n \"acc_stderr\": 0.003195747075480815\n }\n}\n```", "repo_url": "https://huggingface.co/harborwater/dpo-test-hermes-open-llama-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|arc:challenge|25_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|gsm8k|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hellaswag|10_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-56-07.071188.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-56-07.071188.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-56-07.071188.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T04-56-07.071188.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-56-07.071188.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T04_56_07.071188", "path": ["**/details_harness|winogrande|5_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T04-56-07.071188.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T04_56_07.071188", "path": ["results_2024-01-14T04-56-07.071188.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T04-56-07.071188.parquet"]}]}]} | 2024-01-14T04:58:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of harborwater/dpo-test-hermes-open-llama-3b
Dataset automatically created during the evaluation run of model harborwater/dpo-test-hermes-open-llama-3b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
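The snippet below reproduces the loader given in the dataset summary; `harness_winogrande_5` is just one example of the per-task configs listed in this card.

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_harborwater__dpo-test-hermes-open-llama-3b",
                    "harness_winogrande_5",
                    split="train")
```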
## Latest results
These are the latest results from run 2024-01-14T04:56:07.071188 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
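For quick reference, the aggregated "all" metrics recorded for this run are:

```json
{
  "all": {
    "acc": 0.2514093021467422,
    "acc_stderr": 0.03052650097964464,
    "acc_norm": 0.25202173312622367,
    "acc_norm_stderr": 0.03127688845727799,
    "mc1": 0.2484700122399021,
    "mc1_stderr": 0.015127427096520672,
    "mc2": 0.3980562710501165,
    "mc2_stderr": 0.014269053798319005
  }
}
```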
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of harborwater/dpo-test-hermes-open-llama-3b\n\n\n\nDataset automatically created during the evaluation run of model harborwater/dpo-test-hermes-open-llama-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T04:56:07.071188(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of harborwater/dpo-test-hermes-open-llama-3b\n\n\n\nDataset automatically created during the evaluation run of model harborwater/dpo-test-hermes-open-llama-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T04:56:07.071188(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
7cf8aa676fba15e333eb7ba1d016e2605dfc737b |
# Singlish to English 🇸🇬
> Singapore is known for its efficiency and Singlish is no different - it's colourful and snappy. - [Tessa Wong, BBC News, 2015](https://www.bbc.com/news/magazine-33809914)
This is a synthetic dataset generated by GPT-4.
Each json pair contains one Singlish sentence about an everyday activity (e.g. cooking) and its English translation.
# Sample entry
```json
{
  "singlish": "Eh, chop the garlic - you can a not?",
  "english": "Hey, do you know how to chop the garlic?"
}
```
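To use the published data without regenerating it, the pairs can also be pulled straight from the Hub — a minimal sketch, assuming the default CSV-backed `train` split that the `datasets` library builds for this repository:

```python
from datasets import load_dataset

# Load the synthetic Singlish/English pairs from the Hub
ds = load_dataset("cyzgab/singlish-to-english-synthetic", split="train")

# Columns follow the CSV written by the generation script below:
# "index", "singlish", "english"
print(ds[0]["singlish"], "->", ds[0]["english"])
```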
# Data Generation Code
```python
import json
import pandas as pd
from openai import OpenAI
client = OpenAI()
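# Number of Singlish/English pairs requested from GPT-4 for each activity below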
NUM_SAMPLE = 10
ACTIVITIES = ['cooking',
'studying',
'sleeping',
'eating',
'working',
'exercising',
'reading',
'cleaning',
'shopping',
'driving',
'walking',
'bathing',
'going to work',
'listening to music',
'watching TV',
'playing video games',
'using a computer',
'texting',
'socializing',
'meditating',
'commuting',
'doing laundry',
'ironing clothes',
'dusting',
'vacuuming',
'painting',
'drawing',
'grocery shopping',
'sewing',
'taking a nap',
'jogging',
'biking',
'swimming',
'playing sports',
'checking emails',
'playing with children',
'watching movies',
'playing board games',
'attending school or classes',
'going to the gym',
'playing a musical instrument',
'singing',
'dancing',
'writing',
'photography',
'traveling',
'visiting friends',
'attending events',
'volunteering',
'attending meetings']
dataset = {}
for index, activity in enumerate(ACTIVITIES):
print(index, activity)
response = client.chat.completions.create(
model="gpt-4-1106-preview",
messages=[{"role": "system",
"content": "You are an expert in translating Singlish to English"},
{"role": "user",
"content": f"Create {NUM_SAMPLE} random Singlish (s) to English (e) translation pairs in json. Write full sentences about {activity}."\
f"Don't exaggerate the use of Singlish, and be natural, as how a real Singaporean would speak."\
f"Start the keys from {(index*NUM_SAMPLE)+1}. For example,"\
"{'X':{'s': 'aiyo, why like that', 'e': 'oh my, how did this happen'}"\
"..., 'X+5': {'s': 'don't play play', 'e': 'don't fool around'} }"}],
temperature=0.01,
response_format={"type":"json_object"}
)
output = response.choices[0].message.content
output_json = json.loads(output)
dataset.update(output_json)
# Save the current state of the combined dictionary
with open('singlish_to_english_v0.1.json', 'w') as f:
json.dump(dataset, f, indent=None)
# Convert to tabular csv
df = pd.read_json("singlish_to_english_v0.1.json")
df = df.T
df = df.reset_index()
df.columns = ["index", "singlish", "english"]
df.to_csv("singlish_to_english_v0.1.csv", index=False)
``` | cyzgab/singlish-to-english-synthetic | [
"task_categories:translation",
"size_categories:n<1K",
"language:en",
"license:cc-by-nc-sa-4.0",
"region:us"
] | 2024-01-14T05:17:35+00:00 | {"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["n<1K"], "task_categories": ["translation"], "pretty_name": "Singlish to English \ud83c\uddf8\ud83c\uddec"} | 2024-01-14T07:44:18+00:00 | [] | [
"en"
] | TAGS
#task_categories-translation #size_categories-n<1K #language-English #license-cc-by-nc-sa-4.0 #region-us
|
# Singlish to English 🇸🇬
> Singapore is known for its efficiency and Singlish is no different - it's colourful and snappy. - Tessa Wong, BBC News, 2015
This is a synthetic dataset generated by GPT-4.
Each json pair contains one Singlish sentence about an everyday activity (e.g. cooking) and its English translation.
# Sample entry
# Data Generation Code
| [
"# Singlish to English 🇸🇬\n\n> Singapore is known for its efficiency and Singlish is no different - it's colourful and snappy. - Tessa Wong, BBC News, 2015\n\nThis is a synthetic dataset generated by GPT-4.\n\nEach json pair contains one Singlish sentence about an everyday activity (e.g. cooking) and its English translation.",
"# Sample entry",
"# Data Generation Code"
] | [
"TAGS\n#task_categories-translation #size_categories-n<1K #language-English #license-cc-by-nc-sa-4.0 #region-us \n",
"# Singlish to English 🇸🇬\n\n> Singapore is known for its efficiency and Singlish is no different - it's colourful and snappy. - Tessa Wong, BBC News, 2015\n\nThis is a synthetic dataset generated by GPT-4.\n\nEach json pair contains one Singlish sentence about an everyday activity (e.g. cooking) and its English translation.",
"# Sample entry",
"# Data Generation Code"
] |
2d0a331ff2bc9cad7c76f19338226757dc197e06 |
# Dataset of a_545/A-545/A-545 (Girls' Frontline)
This is the dataset of a_545/A-545/A-545 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are `long_hair, bangs, breasts, blonde_hair, braid, twintails, medium_breasts, hat, blue_eyes, aqua_eyes, beret, black_headwear, very_long_hair, braided_bangs, hair_ornament, hairclip`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 36.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 17.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 48 | 34.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 29.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 48 | 54.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/a_545_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
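The IMG+TXT packages above are plain zip archives in which each image is expected to sit next to a same-named `.txt` file holding its comma-separated tags. The snippet below is a rough sketch (not part of any official tooling) that downloads `dataset-800.zip` and pairs images with their tags; the flat layout inside the archive is an assumption, so adapt the listing if the files turn out to be nested.
```python
# Sketch: download the 800px IMG+TXT package and pair images with their tag files.
# Assumed: one same-named, comma-separated .txt tag file per image, stored flat.
import os
import zipfile

from huggingface_hub import hf_hub_download

zip_file = hf_hub_download(
    repo_id='CyberHarem/a_545_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

extract_dir = 'a_545_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)

for name in sorted(os.listdir(extract_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() in {'.png', '.jpg', '.jpeg', '.webp'}:
        tag_path = os.path.join(extract_dir, stem + '.txt')
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                tags = [t.strip() for t in f.read().split(',')]
            print(name, tags[:5])
```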
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/a_545_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, simple_background, white_background, assault_rifle, black_footwear, bodysuit, black_gloves, closed_mouth, smile, black_thighhighs, dress, alcohol, holding_bottle, full_body, high_heel_boots, holding_gun, sitting, thigh_boots |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | simple_background | white_background | assault_rifle | black_footwear | bodysuit | black_gloves | closed_mouth | smile | black_thighhighs | dress | alcohol | holding_bottle | full_body | high_heel_boots | holding_gun | sitting | thigh_boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------------|:-------------------|:----------------|:-----------------|:-----------|:---------------|:---------------|:--------|:-------------------|:--------|:----------|:-----------------|:------------|:------------------|:--------------|:----------|:--------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/a_545_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T05:23:34+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T05:27:55+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of a\_545/A-545/A-545 (Girls' Frontline)
================================================
This is the dataset of a\_545/A-545/A-545 (Girls' Frontline), containing 20 images and their tags.
The core tags of this character are 'long\_hair, bangs, breasts, blonde\_hair, braid, twintails, medium\_breasts, hat, blue\_eyes, aqua\_eyes, beret, black\_headwear, very\_long\_hair, braided\_bangs, hair\_ornament, hairclip', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
b70c9eff184e2e2759b53c4c88263573369985a2 |
# Dataset of 6p62/6P62/6P62 (Girls' Frontline)
This is the dataset of 6p62/6P62/6P62 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are `long_hair, red_hair, blue_eyes, hat, breasts, large_breasts, bangs, glasses, ponytail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 11 | 14.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 11 | 9.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 24 | 16.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 11 | 13.25 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 24 | 22.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/6p62_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
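As a quick sanity check on the IMG+TXT packages above, the sketch below downloads `dataset-800.zip` and tallies the most frequent tags. The one-`.txt`-per-image, comma-separated layout is an assumption rather than a documented guarantee.
```python
# Sketch: count the most common tags in the 800px IMG+TXT package.
# Assumed: each image has a comma-separated .txt tag file somewhere in the archive.
import collections
import os
import zipfile

from huggingface_hub import hf_hub_download

zip_file = hf_hub_download(
    repo_id='CyberHarem/6p62_girlsfrontline',
    repo_type='dataset',
    filename='dataset-800.zip',
)

extract_dir = '6p62_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)

counter = collections.Counter()
for root, _, files in os.walk(extract_dir):
    for name in files:
        if name.endswith('.txt'):
            with open(os.path.join(root, name), 'r', encoding='utf-8') as f:
                counter.update(t.strip() for t in f.read().split(',') if t.strip())

print(counter.most_common(10))
```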
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/6p62_girlsfrontline',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, pantyhose, looking_at_viewer, gun, long_sleeves, shirt, smile, thighhighs, boots, full_body, jacket, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | pantyhose | looking_at_viewer | gun | long_sleeves | shirt | smile | thighhighs | boots | full_body | jacket | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:------------|:--------------------|:------|:---------------|:--------|:--------|:-------------|:--------|:------------|:---------|:-------------------|
| 0 | 11 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/6p62_girlsfrontline | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T05:44:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T05:48:16+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of 6p62/6P62/6P62 (Girls' Frontline)
============================================
This is the dataset of 6p62/6P62/6P62 (Girls' Frontline), containing 11 images and their tags.
The core tags of this character are 'long\_hair, red\_hair, blue\_eyes, hat, breasts, large\_breasts, bangs, glasses, ponytail', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
5a23e0096725e77775d84294f32ca3a2dcbc5da5 |
# Dataset Card for Evaluation run of nisten/shqiponja-59b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nisten/shqiponja-59b-v1](https://huggingface.co/nisten/shqiponja-59b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
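If you only need those aggregated numbers, something like the sketch below may be enough. The `results` configuration name comes from the description above, and the split name mirrors the example that follows; depending on the run you may need `latest` or a timestamped split instead of `train`.
```python
# Hedged sketch: load the aggregated metrics from the "results" configuration.
# Swap split="train" for "latest" or a timestamped split if "train" is absent.
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_nisten__shqiponja-59b-v1",
    "results",
    split="train",
)
print(results[0])
```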
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nisten__shqiponja-59b-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T05:56:39.495831](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__shqiponja-59b-v1/blob/main/results_2024-01-14T05-56-39.495831.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own files and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7432378535506855,
"acc_stderr": 0.02859899074099913,
"acc_norm": 0.7559556232571321,
"acc_norm_stderr": 0.029186017628606568,
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.7043324455434049,
"mc2_stderr": 0.014572093049489886
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.01367881039951882,
"acc_norm": 0.7005119453924915,
"acc_norm_stderr": 0.01338502163731357
},
"harness|hellaswag|10": {
"acc": 0.6440948018323043,
"acc_stderr": 0.004778081784542404,
"acc_norm": 0.8405696076478789,
"acc_norm_stderr": 0.0036532880435558015
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.725925925925926,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.725925925925926,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8552631578947368,
"acc_stderr": 0.028631951845930387,
"acc_norm": 0.8552631578947368,
"acc_norm_stderr": 0.028631951845930387
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8075471698113208,
"acc_stderr": 0.024262979839372274,
"acc_norm": 0.8075471698113208,
"acc_norm_stderr": 0.024262979839372274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8819444444444444,
"acc_stderr": 0.026983346503309382,
"acc_norm": 0.8819444444444444,
"acc_norm_stderr": 0.026983346503309382
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7659574468085106,
"acc_stderr": 0.027678452578212394,
"acc_norm": 0.7659574468085106,
"acc_norm_stderr": 0.027678452578212394
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5964912280701754,
"acc_stderr": 0.04615186962583707,
"acc_norm": 0.5964912280701754,
"acc_norm_stderr": 0.04615186962583707
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7034482758620689,
"acc_stderr": 0.03806142687309992,
"acc_norm": 0.7034482758620689,
"acc_norm_stderr": 0.03806142687309992
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5846560846560847,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.5846560846560847,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6031746031746031,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.6031746031746031,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9,
"acc_stderr": 0.01706640371965727,
"acc_norm": 0.9,
"acc_norm_stderr": 0.01706640371965727
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.026544435312706463,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.026544435312706463
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9242424242424242,
"acc_stderr": 0.018852670234993093,
"acc_norm": 0.9242424242424242,
"acc_norm_stderr": 0.018852670234993093
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9585492227979274,
"acc_stderr": 0.014385432857476444,
"acc_norm": 0.9585492227979274,
"acc_norm_stderr": 0.014385432857476444
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.0199823472086373,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.0199823472086373
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476668,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8361344537815126,
"acc_stderr": 0.02404405494044049,
"acc_norm": 0.8361344537815126,
"acc_norm_stderr": 0.02404405494044049
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.040802441856289715,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.040802441856289715
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.01180036136301657,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.01180036136301657
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.03191923445686185,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.03191923445686185
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9071729957805907,
"acc_stderr": 0.01888975055095671,
"acc_norm": 0.9071729957805907,
"acc_norm_stderr": 0.01888975055095671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.0277901770643836,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.0277901770643836
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.02622223517147737,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.02622223517147737
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622793,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622793
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8981481481481481,
"acc_stderr": 0.02923927267563275,
"acc_norm": 0.8981481481481481,
"acc_norm_stderr": 0.02923927267563275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8650306748466258,
"acc_stderr": 0.026845765054553838,
"acc_norm": 0.8650306748466258,
"acc_norm_stderr": 0.026845765054553838
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5535714285714286,
"acc_stderr": 0.047184714852195865,
"acc_norm": 0.5535714285714286,
"acc_norm_stderr": 0.047184714852195865
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9487179487179487,
"acc_stderr": 0.014450181176872726,
"acc_norm": 0.9487179487179487,
"acc_norm_stderr": 0.014450181176872726
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8991060025542784,
"acc_stderr": 0.01077047201488671,
"acc_norm": 0.8991060025542784,
"acc_norm_stderr": 0.01077047201488671
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.021152676966575266,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.021152676966575266
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7854748603351955,
"acc_stderr": 0.013728923407828853,
"acc_norm": 0.7854748603351955,
"acc_norm_stderr": 0.013728923407828853
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.0216684002565143,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.0216684002565143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8038585209003215,
"acc_stderr": 0.022552447780478026,
"acc_norm": 0.8038585209003215,
"acc_norm_stderr": 0.022552447780478026
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.845679012345679,
"acc_stderr": 0.020100830999850994,
"acc_norm": 0.845679012345679,
"acc_norm_stderr": 0.020100830999850994
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6170212765957447,
"acc_stderr": 0.02899908090480618,
"acc_norm": 0.6170212765957447,
"acc_norm_stderr": 0.02899908090480618
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5788787483702738,
"acc_stderr": 0.012610325733489905,
"acc_norm": 0.5788787483702738,
"acc_norm_stderr": 0.012610325733489905
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8014705882352942,
"acc_stderr": 0.024231013370541093,
"acc_norm": 0.8014705882352942,
"acc_norm_stderr": 0.024231013370541093
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.016011237996336938,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.016011237996336938
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940589,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940589
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.0250002560395462,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.0250002560395462
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5783132530120482,
"acc_stderr": 0.038444531817709175,
"acc_norm": 0.5783132530120482,
"acc_norm_stderr": 0.038444531817709175
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8771929824561403,
"acc_stderr": 0.02517298435015577,
"acc_norm": 0.8771929824561403,
"acc_norm_stderr": 0.02517298435015577
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5373317013463892,
"mc1_stderr": 0.017454645150970588,
"mc2": 0.7043324455434049,
"mc2_stderr": 0.014572093049489886
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050366
},
"harness|gsm8k|5": {
"acc": 0.15466262319939347,
"acc_stderr": 0.009959786220917203
}
}
```
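As an illustration of how the raw results file can be post-processed, the sketch below downloads the JSON linked above and recomputes a simple macro-average over the MMLU (`hendrycksTest`) tasks. The file name is taken from the "Latest results" link; the exact nesting of the JSON is an assumption, so the code falls back to the top level if there is no `results` key.
```python
# Sketch: fetch the results file referenced above and average MMLU accuracy.
# Assumed layout: task scores keyed like "harness|hendrycksTest-...|5" with an "acc" field.
import json

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_nisten__shqiponja-59b-v1",
    repo_type="dataset",
    filename="results_2024-01-14T05-56-39.495831.json",
)
with open(path, "r", encoding="utf-8") as f:
    report = json.load(f)

# the full file may nest task scores under a "results" key; fall back to the top level
scores = report.get("results", report)
mmlu = {k: v for k, v in scores.items() if k.startswith("harness|hendrycksTest-")}
print(sum(v["acc"] for v in mmlu.values()) / len(mmlu))
```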
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_nisten__shqiponja-59b-v1 | [
"region:us"
] | 2024-01-14T05:58:53+00:00 | {"pretty_name": "Evaluation run of nisten/shqiponja-59b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [nisten/shqiponja-59b-v1](https://huggingface.co/nisten/shqiponja-59b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nisten__shqiponja-59b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T05:56:39.495831](https://huggingface.co/datasets/open-llm-leaderboard/details_nisten__shqiponja-59b-v1/blob/main/results_2024-01-14T05-56-39.495831.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7432378535506855,\n \"acc_stderr\": 0.02859899074099913,\n \"acc_norm\": 0.7559556232571321,\n \"acc_norm_stderr\": 0.029186017628606568,\n \"mc1\": 0.5373317013463892,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.7043324455434049,\n \"mc2_stderr\": 0.014572093049489886\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.01367881039951882,\n \"acc_norm\": 0.7005119453924915,\n \"acc_norm_stderr\": 0.01338502163731357\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6440948018323043,\n \"acc_stderr\": 0.004778081784542404,\n \"acc_norm\": 0.8405696076478789,\n \"acc_norm_stderr\": 0.0036532880435558015\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.725925925925926,\n \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.725925925925926,\n \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8552631578947368,\n \"acc_stderr\": 0.028631951845930387,\n \"acc_norm\": 0.8552631578947368,\n \"acc_norm_stderr\": 0.028631951845930387\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.8075471698113208,\n \"acc_stderr\": 0.024262979839372274,\n \"acc_norm\": 0.8075471698113208,\n \"acc_norm_stderr\": 0.024262979839372274\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8819444444444444,\n \"acc_stderr\": 0.026983346503309382,\n \"acc_norm\": 0.8819444444444444,\n \"acc_norm_stderr\": 0.026983346503309382\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 
0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7659574468085106,\n \"acc_stderr\": 0.027678452578212394,\n \"acc_norm\": 0.7659574468085106,\n \"acc_norm_stderr\": 0.027678452578212394\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5964912280701754,\n \"acc_stderr\": 0.04615186962583707,\n \"acc_norm\": 0.5964912280701754,\n \"acc_norm_stderr\": 0.04615186962583707\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.7034482758620689,\n \"acc_stderr\": 0.03806142687309992,\n \"acc_norm\": 0.7034482758620689,\n \"acc_norm_stderr\": 0.03806142687309992\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5846560846560847,\n \"acc_stderr\": 0.0253795249107784,\n \"acc_norm\": 0.5846560846560847,\n \"acc_norm_stderr\": 0.0253795249107784\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6031746031746031,\n \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.6031746031746031,\n \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.01706640371965727,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.01706640371965727\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8666666666666667,\n \"acc_stderr\": 0.026544435312706463,\n \"acc_norm\": 0.8666666666666667,\n \"acc_norm_stderr\": 0.026544435312706463\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.9242424242424242,\n \"acc_stderr\": 0.018852670234993093,\n \"acc_norm\": 0.9242424242424242,\n \"acc_norm_stderr\": 0.018852670234993093\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9585492227979274,\n \"acc_stderr\": 0.014385432857476444,\n \"acc_norm\": 0.9585492227979274,\n \"acc_norm_stderr\": 0.014385432857476444\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.8076923076923077,\n \"acc_stderr\": 0.0199823472086373,\n \"acc_norm\": 0.8076923076923077,\n \"acc_norm_stderr\": 0.0199823472086373\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476668,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476668\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.8361344537815126,\n \"acc_stderr\": 0.02404405494044049,\n \"acc_norm\": 0.8361344537815126,\n \"acc_norm_stderr\": 0.02404405494044049\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.48344370860927155,\n \"acc_stderr\": 0.040802441856289715,\n \"acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.040802441856289715\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9174311926605505,\n \"acc_stderr\": 0.01180036136301657,\n \"acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.01180036136301657\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.03191923445686185,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.03191923445686185\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.9071729957805907,\n \"acc_stderr\": 0.01888975055095671,\n \"acc_norm\": 0.9071729957805907,\n \"acc_norm_stderr\": 0.01888975055095671\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n \"acc_stderr\": 0.0277901770643836,\n \"acc_norm\": 0.7802690582959642,\n \"acc_norm_stderr\": 0.0277901770643836\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.02622223517147737,\n \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.02622223517147737\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622793,\n \"acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622793\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8981481481481481,\n \"acc_stderr\": 0.02923927267563275,\n \"acc_norm\": 0.8981481481481481,\n \"acc_norm_stderr\": 0.02923927267563275\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8650306748466258,\n \"acc_stderr\": 0.026845765054553838,\n \"acc_norm\": 0.8650306748466258,\n \"acc_norm_stderr\": 0.026845765054553838\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5535714285714286,\n \"acc_stderr\": 0.047184714852195865,\n \"acc_norm\": 0.5535714285714286,\n \"acc_norm_stderr\": 0.047184714852195865\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9487179487179487,\n \"acc_stderr\": 0.014450181176872726,\n \"acc_norm\": 0.9487179487179487,\n \"acc_norm_stderr\": 0.014450181176872726\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8991060025542784,\n \"acc_stderr\": 0.01077047201488671,\n \"acc_norm\": 
0.8991060025542784,\n \"acc_norm_stderr\": 0.01077047201488671\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.021152676966575266,\n \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.021152676966575266\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7854748603351955,\n \"acc_stderr\": 0.013728923407828853,\n \"acc_norm\": 0.7854748603351955,\n \"acc_norm_stderr\": 0.013728923407828853\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.0216684002565143,\n \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.0216684002565143\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8038585209003215,\n \"acc_stderr\": 0.022552447780478026,\n \"acc_norm\": 0.8038585209003215,\n \"acc_norm_stderr\": 0.022552447780478026\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850994,\n \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850994\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.6170212765957447,\n \"acc_stderr\": 0.02899908090480618,\n \"acc_norm\": 0.6170212765957447,\n \"acc_norm_stderr\": 0.02899908090480618\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5788787483702738,\n \"acc_stderr\": 0.012610325733489905,\n \"acc_norm\": 0.5788787483702738,\n \"acc_norm_stderr\": 0.012610325733489905\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.8014705882352942,\n \"acc_stderr\": 0.024231013370541093,\n \"acc_norm\": 0.8014705882352942,\n \"acc_norm_stderr\": 0.024231013370541093\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.016011237996336938,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.016011237996336938\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.04265792110940589,\n \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04265792110940589\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.0250002560395462,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.0250002560395462\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5783132530120482,\n \"acc_stderr\": 0.038444531817709175,\n \"acc_norm\": 0.5783132530120482,\n \"acc_norm_stderr\": 0.038444531817709175\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8771929824561403,\n \"acc_stderr\": 0.02517298435015577,\n \"acc_norm\": 0.8771929824561403,\n \"acc_norm_stderr\": 0.02517298435015577\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5373317013463892,\n \"mc1_stderr\": 0.017454645150970588,\n \"mc2\": 0.7043324455434049,\n \"mc2_stderr\": 0.014572093049489886\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050366\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15466262319939347,\n \"acc_stderr\": 0.009959786220917203\n }\n}\n```", "repo_url": "https://huggingface.co/nisten/shqiponja-59b-v1", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|arc:challenge|25_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|gsm8k|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hellaswag|10_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T05-56-39.495831.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T05-56-39.495831.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T05-56-39.495831.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T05-56-39.495831.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T05-56-39.495831.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T05-56-39.495831.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["**/details_harness|winogrande|5_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T05-56-39.495831.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T05_56_39.495831", "path": ["results_2024-01-14T05-56-39.495831.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T05-56-39.495831.parquet"]}]}]} | 2024-01-14T05:59:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of nisten/shqiponja-59b-v1
Dataset automatically created during the evaluation run of model nisten/shqiponja-59b-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
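A minimal sketch is shown below; the repository id and configuration name are inferred from the naming pattern used by these leaderboard detail datasets and should be treated as assumptions rather than values confirmed by this card:

```python
from datasets import load_dataset

# Hypothetical repository id, following the "open-llm-leaderboard/details_<org>__<model>" pattern
data = load_dataset(
    "open-llm-leaderboard/details_nisten__shqiponja-59b-v1",
    "harness_winogrande_5",  # assumed config name; one of the 63 task configurations
    split="latest",          # the "latest" split points to the most recent run
)
```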
## Latest results
These are the latest results from run 2024-01-14T05:56:39.495831 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of nisten/shqiponja-59b-v1\n\n\n\nDataset automatically created during the evaluation run of model nisten/shqiponja-59b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T05:56:39.495831(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nisten/shqiponja-59b-v1\n\n\n\nDataset automatically created during the evaluation run of model nisten/shqiponja-59b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T05:56:39.495831(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
1f168f0b073b92a4af583cf865d304e50df2f4fa | Dataset collected from [PGB: A PubMed Graph Benchmark for Heterogeneous Network Representation Learning](https://arxiv.org/pdf/2305.02691.pdf)
Description :
inbound_citation (List): list of PMIDs that cite the paper
outbound_citation (List): references of the paper
PMID : Pubmed ID | bisectgroup/PMID_CITED_forKG | [
"arxiv:2305.02691",
"region:us"
] | 2024-01-14T06:01:13+00:00 | {} | 2024-01-14T08:29:27+00:00 | [
"2305.02691"
] | [] | TAGS
#arxiv-2305.02691 #region-us
| Dataset collected from PGB: A PubMed Graph Benchmark for Heterogeneous Network Representation Learning
Description :
inbound_citation (List): list of PMIDs that cite the paper
outbound_citation (List): references of the paper
PMID : Pubmed ID | [] | [
"TAGS\n#arxiv-2305.02691 #region-us \n"
] |
b917c17240b4e9e5bf9551b09f2ef29ba9c66b5a |
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-14T06:20:20.648218](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1/blob/main/results_2024-01-14T06-20-20.648218.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6191660640057981,
"acc_stderr": 0.03263652891344978,
"acc_norm": 0.6271945727055741,
"acc_norm_stderr": 0.03333445432068468,
"mc1": 0.43329253365973075,
"mc1_stderr": 0.017347024450107492,
"mc2": 0.5997212380160826,
"mc2_stderr": 0.015696061571327326
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131167,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6580362477594105,
"acc_stderr": 0.004733980470799212,
"acc_norm": 0.8462457677753435,
"acc_norm_stderr": 0.0035997580435468044
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.02937364625323469,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.02937364625323469
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.03260038511835771,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.03260038511835771
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155243,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155243
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110932,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683512,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683512
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612907,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612907
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.031602951437766785,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.031602951437766785
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266196,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266196
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3865921787709497,
"acc_stderr": 0.016286674879101022,
"acc_norm": 0.3865921787709497,
"acc_norm_stderr": 0.016286674879101022
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914389,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914389
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.02972215209928007,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.02972215209928007
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.01922832201869664,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.01922832201869664
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.04494290866252091,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.04494290866252091
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784603,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.43329253365973075,
"mc1_stderr": 0.017347024450107492,
"mc2": 0.5997212380160826,
"mc2_stderr": 0.015696061571327326
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209408
},
"harness|gsm8k|5": {
"acc": 0.20318423047763456,
"acc_stderr": 0.011083227665267797
}
}
```
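
As a sketch of how the aggregated numbers above might be inspected programmatically (the "results" configuration and the "latest" split follow the conventions described earlier in this card; treat the exact names as assumptions):

```python
from datasets import load_dataset

# Aggregated results of the run; each row is expected to hold the metrics for one evaluation run
results = load_dataset(
    "open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1",
    "results",
    split="latest",  # "latest" points to the most recent run
)
print(results[0])  # should contain the aggregated metrics shown above (acc, acc_norm, mc1, mc2, ...)
```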
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1 | [
"region:us"
] | 2024-01-14T06:22:39+00:00 | {"pretty_name": "Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T06:20:20.648218](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1/blob/main/results_2024-01-14T06-20-20.648218.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6191660640057981,\n \"acc_stderr\": 0.03263652891344978,\n \"acc_norm\": 0.6271945727055741,\n \"acc_norm_stderr\": 0.03333445432068468,\n \"mc1\": 0.43329253365973075,\n \"mc1_stderr\": 0.017347024450107492,\n \"mc2\": 0.5997212380160826,\n \"mc2_stderr\": 0.015696061571327326\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131167,\n \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6580362477594105,\n \"acc_stderr\": 0.004733980470799212,\n \"acc_norm\": 0.8462457677753435,\n \"acc_norm_stderr\": 0.0035997580435468044\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.02937364625323469,\n \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.02937364625323469\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.037455547914624555\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.03260038511835771,\n \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.03260038511835771\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155243,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155243\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n \"acc_norm\": 0.8704663212435233,\n 
\"acc_norm_stderr\": 0.024233532297758723\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110932,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110932\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683512,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683512\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612907,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612907\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n \"acc_stderr\": 0.031602951437766785,\n \"acc_norm\": 0.6681614349775785,\n \"acc_norm_stderr\": 0.031602951437766785\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266196,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266196\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3865921787709497,\n \"acc_stderr\": 0.016286674879101022,\n \"acc_norm\": 0.3865921787709497,\n \"acc_norm_stderr\": 0.016286674879101022\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914389,\n \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914389\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.02972215209928007,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.02972215209928007\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.01922832201869664,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.01922832201869664\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.04494290866252091,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.04494290866252091\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784603,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784603\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.43329253365973075,\n \"mc1_stderr\": 0.017347024450107492,\n \"mc2\": 0.5997212380160826,\n \"mc2_stderr\": 0.015696061571327326\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209408\n },\n \"harness|gsm8k|5\": {\n \"acc\": 
0.20318423047763456,\n \"acc_stderr\": 0.011083227665267797\n }\n}\n```", "repo_url": "https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T06_20_20.648218", "path": ["**/details_harness|winogrande|5_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T06-20-20.648218.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T06_20_20.648218", "path": ["results_2024-01-14T06-20-20.648218.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T06-20-20.648218.parquet"]}]}]} | 2024-01-14T06:22:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1
Dataset automatically created during the evaluation run of model HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
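A minimal sketch (the repository name below is an assumption, following the leaderboard's usual `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# assumed details repository for this model, per the leaderboard naming convention
data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1",
    "harness_winogrande_5",
    split="train")
```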
## Latest results
These are the latest results from run 2024-01-14T06:20:20.648218 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in its task-specific configuration, under the "latest" split for each eval):
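As a sketch, the aggregated numbers can be pulled from the "results" configuration (again assuming the repository name used above):

```python
from datasets import load_dataset

# the "latest" split always points at the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v1",
    "results",
    split="latest")
print(results[0])  # aggregated metrics for this run
```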
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1\n\n\n\nDataset automatically created during the evaluation run of model HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T06:20:20.648218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1\n\n\n\nDataset automatically created during the evaluation run of model HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T06:20:20.648218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
defc20a14952a9d5a4c8f97f4ed14aeb5cd01f17 |
# Dataset of hunter/ハンター/猎人 (Azur Lane)
This is the dataset of hunter/ハンター/猎人 (Azur Lane), containing 24 images and their tags.
The core tags of this character are `hat, long_hair, red_eyes, brown_hair, bangs, blonde_hair, hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 24 | 26.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 24 | 15.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 48 | 31.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 24 | 23.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 48 | 43.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hunter_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
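Any of the packaged variants above can also be fetched directly; a minimal sketch for the 800px IMG+TXT package (the extraction directory name is arbitrary):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package listed above
zip_file = hf_hub_download(
    repo_id='CyberHarem/hunter_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract images and their side-by-side .txt tag files
package_dir = 'hunter_800'
os.makedirs(package_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(package_dir)
```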
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hunter_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------|
| 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, tricorne, gloves, scarf, navel, shorts, belt, gun, looking_at_viewer, midriff, thighhighs, boots, jacket |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | tricorne | gloves | scarf | navel | shorts | belt | gun | looking_at_viewer | midriff | thighhighs | boots | jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------|:---------|:--------|:--------|:---------|:-------|:------|:--------------------|:----------|:-------------|:--------|:---------|
| 0 | 24 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/hunter_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:38:28+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T06:43:41+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of hunter/ハンター/猎人 (Azur Lane)
=====================================
This is the dataset of hunter/ハンター/猎人 (Azur Lane), containing 24 images and their tags.
The core tags of this character are 'hat, long\_hair, red\_eyes, brown\_hair, bangs, blonde\_hair, hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
9f816323a3958315604cbeac9fb517753b359b39 |
# Dataset of bulldog/ブルドッグ/大斗犬 (Azur Lane)
This is the dataset of bulldog/ブルドッグ/大斗犬 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `hair_ornament, short_hair, red_eyes, white_hair, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 11.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 8.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 37 | 17.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 11.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 37 | 22.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/bulldog_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
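Once one of the IMG+TXT packages above has been downloaded and extracted, every image is paired with a same-named `.txt` file holding its tags; a minimal sketch for walking the pairs (the directory name and image extension are assumptions):

```python
import glob
import os

package_dir = 'bulldog_800'  # wherever an IMG+TXT package was extracted

for txt_path in sorted(glob.glob(os.path.join(package_dir, '**', '*.txt'), recursive=True)):
    stem = os.path.splitext(txt_path)[0]
    image_path = stem + '.png'  # packages may also contain .jpg/.webp images
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(image_path, tags)
```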
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/bulldog_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, solo, white_gloves, simple_background, blush, pleated_skirt, short_sleeves, white_background, looking_at_viewer, closed_mouth, white_shirt, full_body, white_skirt, black_thighhighs, brooch, thigh_strap |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_gloves | simple_background | blush | pleated_skirt | short_sleeves | white_background | looking_at_viewer | closed_mouth | white_shirt | full_body | white_skirt | black_thighhighs | brooch | thigh_strap |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------------|:--------|:----------------|:----------------|:-------------------|:--------------------|:---------------|:--------------|:------------|:--------------|:-------------------|:---------|:--------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/bulldog_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:38:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T06:42:06+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of bulldog/ブルドッグ/大斗犬 (Azur Lane)
========================================
This is the dataset of bulldog/ブルドッグ/大斗犬 (Azur Lane), containing 17 images and their tags.
The core tags of this character are 'hair\_ornament, short\_hair, red\_eyes, white\_hair, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a0d5cb8401b1224e95d7f0762981b12b0768527b |
# Dataset Card for Evaluation run of ewqr2130/TinyLamma-SFT
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ewqr2130/TinyLamma-SFT](https://huggingface.co/ewqr2130/TinyLamma-SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT",
"harness_winogrande_5",
split="train")
```
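Each row of such a details split corresponds to one evaluated example; a quick way to peek at the loaded data:

```python
print(len(data))   # number of evaluated examples for this task
print(data[0])     # full record for the first example
```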
## Latest results
These are the [latest results from run 2024-01-14T06:47:16.082235](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT/blob/main/results_2024-01-14T06-47-16.082235.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in its task-specific configuration, under the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2490861506190792,
"acc_stderr": 0.030409991529850307,
"acc_norm": 0.25019487136318186,
"acc_norm_stderr": 0.031150216526222626,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807763,
"mc2": 0.372025520096151,
"mc2_stderr": 0.013802667425788874
},
"harness|arc:challenge|25": {
"acc": 0.3174061433447099,
"acc_stderr": 0.01360223908803817,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.4475204142601075,
"acc_stderr": 0.004962220512548357,
"acc_norm": 0.5914160525791675,
"acc_norm_stderr": 0.004905674408614011
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.15789473684210525,
"acc_stderr": 0.029674167520101474,
"acc_norm": 0.15789473684210525,
"acc_norm_stderr": 0.029674167520101474
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566019,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566019
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.030299574664788147,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.030299574664788147
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617748,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617748
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412424,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412424
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022057,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022057
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708624,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708624
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.037649508797906066,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.037649508797906066
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25806451612903225,
"acc_stderr": 0.024892469172462826,
"acc_norm": 0.25806451612903225,
"acc_norm_stderr": 0.024892469172462826
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21182266009852216,
"acc_stderr": 0.028748983689941054,
"acc_norm": 0.21182266009852216,
"acc_norm_stderr": 0.028748983689941054
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.03051611137147601,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.03051611137147601
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.02075242372212801,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.02075242372212801
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868963,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868963
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804724,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804724
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.01754937638931369,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.01754937638931369
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536023,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536023
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693254,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2869198312236287,
"acc_stderr": 0.02944377302259469,
"acc_norm": 0.2869198312236287,
"acc_norm_stderr": 0.02944377302259469
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.1262135922330097,
"acc_stderr": 0.03288180278808628,
"acc_norm": 0.1262135922330097,
"acc_norm_stderr": 0.03288180278808628
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749475,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749475
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2720306513409962,
"acc_stderr": 0.015913367447500517,
"acc_norm": 0.2720306513409962,
"acc_norm_stderr": 0.015913367447500517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.23410404624277456,
"acc_stderr": 0.022797110278071134,
"acc_norm": 0.23410404624277456,
"acc_norm_stderr": 0.022797110278071134
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3022508038585209,
"acc_stderr": 0.026082700695399672,
"acc_norm": 0.3022508038585209,
"acc_norm_stderr": 0.026082700695399672
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432414,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432414
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.19852941176470587,
"acc_stderr": 0.024231013370541107,
"acc_norm": 0.19852941176470587,
"acc_norm_stderr": 0.024231013370541107
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667195,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667195
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.14285714285714285,
"acc_stderr": 0.022401787435256396,
"acc_norm": 0.14285714285714285,
"acc_norm_stderr": 0.022401787435256396
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553026,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553026
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807763,
"mc2": 0.372025520096151,
"mc2_stderr": 0.013802667425788874
},
"harness|winogrande|5": {
"acc": 0.5864246250986582,
"acc_stderr": 0.013840971763195303
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224265
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT | [
"region:us"
] | 2024-01-14T06:49:04+00:00 | {"pretty_name": "Evaluation run of ewqr2130/TinyLamma-SFT", "dataset_summary": "Dataset automatically created during the evaluation run of model [ewqr2130/TinyLamma-SFT](https://huggingface.co/ewqr2130/TinyLamma-SFT) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T06:47:16.082235](https://huggingface.co/datasets/open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT/blob/main/results_2024-01-14T06-47-16.082235.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2490861506190792,\n \"acc_stderr\": 0.030409991529850307,\n \"acc_norm\": 0.25019487136318186,\n \"acc_norm_stderr\": 0.031150216526222626,\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.372025520096151,\n \"mc2_stderr\": 0.013802667425788874\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3174061433447099,\n \"acc_stderr\": 0.01360223908803817,\n \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156213\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4475204142601075,\n \"acc_stderr\": 0.004962220512548357,\n \"acc_norm\": 0.5914160525791675,\n \"acc_norm_stderr\": 0.004905674408614011\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.15789473684210525,\n \"acc_stderr\": 0.029674167520101474,\n \"acc_norm\": 0.15789473684210525,\n \"acc_norm_stderr\": 0.029674167520101474\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566019,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.03716177437566019\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n 
\"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n \"acc_stderr\": 0.030299574664788147,\n \"acc_norm\": 0.19653179190751446,\n \"acc_norm_stderr\": 0.030299574664788147\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617748,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617748\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412424,\n \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412424\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022057,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022057\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708624,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708624\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.037649508797906066,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.037649508797906066\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25806451612903225,\n \"acc_stderr\": 0.024892469172462826,\n \"acc_norm\": 0.25806451612903225,\n \"acc_norm_stderr\": 0.024892469172462826\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.21182266009852216,\n \"acc_stderr\": 0.028748983689941054,\n \"acc_norm\": 0.21182266009852216,\n \"acc_norm_stderr\": 0.028748983689941054\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586815,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586815\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.03051611137147601,\n \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.03051611137147601\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.02075242372212801,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.02075242372212801\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868963,\n \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868963\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804724,\n \"acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804724\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.21284403669724772,\n \"acc_stderr\": 0.01754937638931369,\n \"acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.01754937638931369\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536023,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536023\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693254,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693254\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2869198312236287,\n \"acc_stderr\": 0.02944377302259469,\n \"acc_norm\": 0.2869198312236287,\n \"acc_norm_stderr\": 0.02944377302259469\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1262135922330097,\n \"acc_stderr\": 0.03288180278808628,\n \"acc_norm\": 0.1262135922330097,\n \"acc_norm_stderr\": 0.03288180278808628\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n \"acc_stderr\": 0.028911208802749475,\n \"acc_norm\": 0.26495726495726496,\n \"acc_norm_stderr\": 0.028911208802749475\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2720306513409962,\n \"acc_stderr\": 
0.015913367447500517,\n \"acc_norm\": 0.2720306513409962,\n \"acc_norm_stderr\": 0.015913367447500517\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.23410404624277456,\n \"acc_stderr\": 0.022797110278071134,\n \"acc_norm\": 0.23410404624277456,\n \"acc_norm_stderr\": 0.022797110278071134\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3022508038585209,\n \"acc_stderr\": 0.026082700695399672,\n \"acc_norm\": 0.3022508038585209,\n \"acc_norm_stderr\": 0.026082700695399672\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432414,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432414\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.19852941176470587,\n \"acc_stderr\": 0.024231013370541107,\n \"acc_norm\": 0.19852941176470587,\n \"acc_norm_stderr\": 0.024231013370541107\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667195,\n \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667195\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.14285714285714285,\n \"acc_stderr\": 0.022401787435256396,\n \"acc_norm\": 0.14285714285714285,\n \"acc_norm_stderr\": 0.022401787435256396\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553026,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553026\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.372025520096151,\n \"mc2_stderr\": 0.013802667425788874\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5864246250986582,\n \"acc_stderr\": 0.013840971763195303\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \"acc_stderr\": 0.0035275958887224265\n 
}\n}\n```", "repo_url": "https://huggingface.co/ewqr2130/TinyLamma-SFT", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-47-16.082235.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-47-16.082235.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-47-16.082235.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-47-16.082235.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-47-16.082235.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T06_47_16.082235", "path": ["**/details_harness|winogrande|5_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T06-47-16.082235.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T06_47_16.082235", "path": ["results_2024-01-14T06-47-16.082235.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T06-47-16.082235.parquet"]}]}]} | 2024-01-14T06:49:24+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of ewqr2130/TinyLamma-SFT
Dataset automatically created during the evaluation run of model ewqr2130/TinyLamma-SFT on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
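A minimal example (the repository id, configuration name, and split name are taken from this card's metadata; any of the other listed task configurations can be substituted):

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration.
# The "train" split always points to the latest evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT",
    "harness_winogrande_5",
    split="train",
)
print(data)
```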
## Latest results
These are the latest results from run 2024-01-14T06:47:16.082235 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
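The aggregated metrics shown in the results block earlier in this card are also stored in the "results" configuration; a minimal sketch for loading them (config and split names taken from this card's metadata):

```python
from datasets import load_dataset

# Load the aggregated metrics; the "latest" split always points to the
# most recent evaluation run (here 2024-01-14T06:47:16.082235).
results = load_dataset(
    "open-llm-leaderboard/details_ewqr2130__TinyLamma-SFT",
    "results",
    split="latest",
)
print(results[0])
```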
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of ewqr2130/TinyLamma-SFT\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/TinyLamma-SFT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T06:47:16.082235(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ewqr2130/TinyLamma-SFT\n\n\n\nDataset automatically created during the evaluation run of model ewqr2130/TinyLamma-SFT on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T06:47:16.082235(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
f11af3a2f535da364d3fa988ca136b89aa859cfa |
# Dataset Card for Evaluation run of Suprit/Zhongjing-LLaMA-base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Suprit/Zhongjing-LLaMA-base](https://huggingface.co/Suprit/Zhongjing-LLaMA-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base",
"harness_winogrande_5",
split="train")
```
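As an additional, non-official sketch based on the configuration and split names listed in this card's metadata, the aggregated metrics can also be pulled from the "results" configuration, where the "latest" split points to the most recent run:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# points to the most recent evaluation (2024-01-14T06:48:13.310278 here).
results = load_dataset("open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base",
	"results",
	split="latest")
print(results[0])  # inspect the first row of aggregated results
```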
## Latest results
These are the [latest results from run 2024-01-14T06:48:13.310278](https://huggingface.co/datasets/open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base/blob/main/results_2024-01-14T06-48-13.310278.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.48560692489095947,
"acc_stderr": 0.03450713063212824,
"acc_norm": 0.48879741973292123,
"acc_norm_stderr": 0.03524925803152966,
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4888307270560722,
"mc2_stderr": 0.015123753734506709
},
"harness|arc:challenge|25": {
"acc": 0.5196245733788396,
"acc_stderr": 0.0146001320759471,
"acc_norm": 0.5511945392491467,
"acc_norm_stderr": 0.014534599585097664
},
"harness|hellaswag|10": {
"acc": 0.6026687910774746,
"acc_stderr": 0.004883455188908963,
"acc_norm": 0.7971519617606054,
"acc_norm_stderr": 0.004012984497778308
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970275,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970275
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101737,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101737
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.38620689655172413,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.38620689655172413,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.0437588849272706,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.0437588849272706
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.028327743091561074,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.028327743091561074
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.03464881675016339,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.03464881675016339
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.033088185944157494,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.033088185944157494
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4564102564102564,
"acc_stderr": 0.025254485424799595,
"acc_norm": 0.4564102564102564,
"acc_norm_stderr": 0.025254485424799595
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871923,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871923
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.48739495798319327,
"acc_stderr": 0.032468167657521745,
"acc_norm": 0.48739495798319327,
"acc_norm_stderr": 0.032468167657521745
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6403669724770642,
"acc_stderr": 0.020575234660123776,
"acc_norm": 0.6403669724770642,
"acc_norm_stderr": 0.020575234660123776
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.031141447823536027,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.031141447823536027
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.679324894514768,
"acc_stderr": 0.03038193194999041,
"acc_norm": 0.679324894514768,
"acc_norm_stderr": 0.03038193194999041
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4618834080717489,
"acc_stderr": 0.03346015011973228,
"acc_norm": 0.4618834080717489,
"acc_norm_stderr": 0.03346015011973228
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138938,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138938
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.046355501356099754,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.046355501356099754
},
"harness|hendrycksTest-management|5": {
"acc": 0.6407766990291263,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.6407766990291263,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7478632478632479,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.7478632478632479,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6845466155810983,
"acc_stderr": 0.016617501738763397,
"acc_norm": 0.6845466155810983,
"acc_norm_stderr": 0.016617501738763397
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5173410404624278,
"acc_stderr": 0.026902900458666647,
"acc_norm": 0.5173410404624278,
"acc_norm_stderr": 0.026902900458666647
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260664,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260664
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.028274359854894248,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.028274359854894248
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5771604938271605,
"acc_stderr": 0.02748747298087159,
"acc_norm": 0.5771604938271605,
"acc_norm_stderr": 0.02748747298087159
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.028538650028878645,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.028538650028878645
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.012150699768228553,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.012150699768228553
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.020203517280261443,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.020203517280261443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.031987615467631264,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.031987615467631264
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333334,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34761321909424725,
"mc1_stderr": 0.016670769188897303,
"mc2": 0.4888307270560722,
"mc2_stderr": 0.015123753734506709
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259785
},
"harness|gsm8k|5": {
"acc": 0.2608036391205459,
"acc_stderr": 0.012094252417332734
}
}
```
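The block above is plain nested JSON, so individual metrics can be read with ordinary dictionary lookups once it is parsed; a minimal, self-contained sketch using a small excerpt of the values shown above:
```python
import json

# Small excerpt of the results block above, inlined so the example runs as-is.
results = json.loads("""
{
  "all": {"acc": 0.48560692489095947, "acc_norm": 0.48879741973292123},
  "harness|arc:challenge|25": {"acc": 0.5196245733788396, "acc_norm": 0.5511945392491467}
}
""")

acc_all = results["all"]["acc"]
arc_acc_norm = results["harness|arc:challenge|25"]["acc_norm"]
print(f"average acc: {acc_all:.4f}, ARC-challenge acc_norm: {arc_acc_norm:.4f}")
```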
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base | [
"region:us"
] | 2024-01-14T06:50:02+00:00 | {"pretty_name": "Evaluation run of Suprit/Zhongjing-LLaMA-base", "dataset_summary": "Dataset automatically created during the evaluation run of model [Suprit/Zhongjing-LLaMA-base](https://huggingface.co/Suprit/Zhongjing-LLaMA-base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T06:48:13.310278](https://huggingface.co/datasets/open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base/blob/main/results_2024-01-14T06-48-13.310278.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.48560692489095947,\n \"acc_stderr\": 0.03450713063212824,\n \"acc_norm\": 0.48879741973292123,\n \"acc_norm_stderr\": 0.03524925803152966,\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4888307270560722,\n \"mc2_stderr\": 0.015123753734506709\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5196245733788396,\n \"acc_stderr\": 0.0146001320759471,\n \"acc_norm\": 0.5511945392491467,\n \"acc_norm_stderr\": 0.014534599585097664\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6026687910774746,\n \"acc_stderr\": 0.004883455188908963,\n \"acc_norm\": 0.7971519617606054,\n \"acc_norm_stderr\": 0.004012984497778308\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.040403110624904356,\n \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.040403110624904356\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.33,\n 
\"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n \"acc_stderr\": 0.037940126746970275,\n \"acc_norm\": 0.4508670520231214,\n \"acc_norm_stderr\": 0.037940126746970275\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101737,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101737\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.38620689655172413,\n \"acc_stderr\": 0.04057324734419035,\n \"acc_norm\": 0.38620689655172413,\n \"acc_norm_stderr\": 0.04057324734419035\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.0437588849272706,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.0437588849272706\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5451612903225806,\n \"acc_stderr\": 0.028327743091561074,\n \"acc_norm\": 0.5451612903225806,\n \"acc_norm_stderr\": 0.028327743091561074\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6161616161616161,\n \"acc_stderr\": 0.03464881675016339,\n \"acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.03464881675016339\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.033088185944157494,\n \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.033088185944157494\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.4564102564102564,\n \"acc_stderr\": 0.025254485424799595,\n \"acc_norm\": 0.4564102564102564,\n \"acc_norm_stderr\": 0.025254485424799595\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871923,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871923\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.48739495798319327,\n \"acc_stderr\": 0.032468167657521745,\n \"acc_norm\": 0.48739495798319327,\n \"acc_norm_stderr\": 0.032468167657521745\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6403669724770642,\n \"acc_stderr\": 0.020575234660123776,\n \"acc_norm\": 0.6403669724770642,\n \"acc_norm_stderr\": 0.020575234660123776\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.031141447823536027,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.031141447823536027\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.679324894514768,\n \"acc_stderr\": 0.03038193194999041,\n \"acc_norm\": 0.679324894514768,\n \"acc_norm_stderr\": 0.03038193194999041\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4618834080717489,\n \"acc_stderr\": 0.03346015011973228,\n \"acc_norm\": 0.4618834080717489,\n \"acc_norm_stderr\": 0.03346015011973228\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138938,\n \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138938\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.046355501356099754,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.046355501356099754\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6407766990291263,\n \"acc_stderr\": 0.047504583990416946,\n \"acc_norm\": 0.6407766990291263,\n \"acc_norm_stderr\": 0.047504583990416946\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.7478632478632479,\n \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6845466155810983,\n \"acc_stderr\": 0.016617501738763397,\n 
\"acc_norm\": 0.6845466155810983,\n \"acc_norm_stderr\": 0.016617501738763397\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5173410404624278,\n \"acc_stderr\": 0.026902900458666647,\n \"acc_norm\": 0.5173410404624278,\n \"acc_norm_stderr\": 0.026902900458666647\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260664,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n \"acc_stderr\": 0.028274359854894248,\n \"acc_norm\": 0.5466237942122186,\n \"acc_norm_stderr\": 0.028274359854894248\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5771604938271605,\n \"acc_stderr\": 0.02748747298087159,\n \"acc_norm\": 0.5771604938271605,\n \"acc_norm_stderr\": 0.02748747298087159\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.028538650028878645,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.028538650028878645\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.012150699768228553,\n \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.012150699768228553\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.030365446477275675,\n \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.030365446477275675\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.47549019607843135,\n \"acc_stderr\": 0.020203517280261443,\n \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.020203517280261443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.031987615467631264,\n \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.031987615467631264\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03333333333333334,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03333333333333334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34761321909424725,\n \"mc1_stderr\": 0.016670769188897303,\n \"mc2\": 0.4888307270560722,\n \"mc2_stderr\": 0.015123753734506709\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259785\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2608036391205459,\n \"acc_stderr\": 0.012094252417332734\n }\n}\n```", "repo_url": 
"https://huggingface.co/Suprit/Zhongjing-LLaMA-base", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-48-13.310278.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["**/details_harness|winogrande|5_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T06-48-13.310278.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T06_48_13.310278", "path": ["results_2024-01-14T06-48-13.310278.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T06-48-13.310278.parquet"]}]}]} | 2024-01-14T06:50:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Suprit/Zhongjing-LLaMA-base
Dataset automatically created during the evaluation run of model Suprit/Zhongjing-LLaMA-base on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
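A minimal loading sketch follows; the repository name is an assumption inferred from the naming pattern used by the other evaluation-run datasets in this collection, and "harness_winogrande_5" is just one of the 63 task configurations.
```python
from datasets import load_dataset

# NOTE: the repo id below is inferred from the usual
# open-llm-leaderboard/details_<org>__<model> naming pattern and may need adjusting.
data = load_dataset(
    "open-llm-leaderboard/details_Suprit__Zhongjing-LLaMA-base",
    "harness_winogrande_5",  # any of the 63 task configurations works here
    split="train",           # "train" always points to the latest results
)
print(data)
```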
## Latest results
These are the latest results from run 2024-01-14T06:48:13.310278 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Suprit/Zhongjing-LLaMA-base\n\n\n\nDataset automatically created during the evaluation run of model Suprit/Zhongjing-LLaMA-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T06:48:13.310278(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Suprit/Zhongjing-LLaMA-base\n\n\n\nDataset automatically created during the evaluation run of model Suprit/Zhongjing-LLaMA-base on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T06:48:13.310278(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
699e571c33c977be08b9ac4c584bd9600c595b7b |
# Dataset of craven/クレイヴン/克雷文 (Azur Lane)
This is the dataset of craven/クレイヴン/克雷文 (Azur Lane), containing 17 images and their tags.
The core tags of this character are `long_hair, purple_hair, drill_hair, yellow_eyes, bangs, breasts, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 17 | 14.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 17 | 10.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 36 | 20.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 17 | 13.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 36 | 25.45 MiB | [Download](https://huggingface.co/datasets/CyberHarem/craven_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/craven_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
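As a small follow-up sketch, the same loop can be used to pick out images carrying a tag of interest. Only the fields already demonstrated above (`item.meta['filename']` and `item.meta['tags']`) are used; the tag name is a hypothetical example taken from the cluster table below.
```python
from waifuc.source import LocalSource

source = LocalSource('dataset_dir')
wanted_tag = 'solo'  # example tag; see the cluster tables for tags present in this dataset

# 'tags' is assumed to be a mapping (or list) of tag names, so membership testing works either way.
matching = [
    item.meta['filename']
    for item in source
    if wanted_tag in item.meta.get('tags', {})
]
print(f"{len(matching)} images tagged with '{wanted_tag}':", matching)
```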
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, blush, smile, looking_at_viewer, solo, open_mouth, white_thighhighs, navel, pleated_skirt, full_body, sailor_collar, school_uniform, shirt, shoes, standing, cheerleader, collarbone, long_sleeves, midriff, one_eye_closed, pom_pom_(cheerleading), white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | smile | looking_at_viewer | solo | open_mouth | white_thighhighs | navel | pleated_skirt | full_body | sailor_collar | school_uniform | shirt | shoes | standing | cheerleader | collarbone | long_sleeves | midriff | one_eye_closed | pom_pom_(cheerleading) | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------|:--------------------|:-------|:-------------|:-------------------|:--------|:----------------|:------------|:----------------|:-----------------|:--------|:--------|:-----------|:--------------|:-------------|:---------------|:----------|:-----------------|:-------------------------|:-------------------|
| 0 | 17 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/craven_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:54:58+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T06:58:59+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of craven/クレイヴン/克雷文 (Azur Lane)
=======================================
This is the dataset of craven/クレイヴン/克雷文 (Azur Lane), containing 17 images and their tags.
The core tags of this character are 'long\_hair, purple\_hair, drill\_hair, yellow\_eyes, bangs, breasts, hair\_between\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
170a5823e7cb0f7da2c7d9afe9e703b58f6c009d |
# Dataset of stephen_potter/ステフェン・ポッター/史蒂芬·波特 (Azur Lane)
This is the dataset of stephen_potter/ステフェン・ポッター/史蒂芬·波特 (Azur Lane), containing 21 images and their tags.
The core tags of this character are `blonde_hair, blue_eyes, hat, braid, hair_ornament, breasts, beret`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 21 | 32.12 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 21 | 16.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 54 | 36.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 21 | 28.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 54 | 55.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/stephen_potter_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/stephen_potter_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, blush, navel, pantyhose, sailor_collar, shorts, simple_background, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | blush | navel | pantyhose | sailor_collar | shorts | simple_background | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------|:--------|:------------|:----------------|:---------|:--------------------|:-------------|
| 0 | 21 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/stephen_potter_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:55:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:02:06+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of stephen\_potter/ステフェン・ポッター/史蒂芬·波特 (Azur Lane)
========================================================
This is the dataset of stephen\_potter/ステフェン・ポッター/史蒂芬·波特 (Azur Lane), containing 21 images and their tags.
The core tags of this character are 'blonde\_hair, blue\_eyes, hat, braid, hair\_ornament, breasts, beret', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
2b3efafef5e1dd19be0777c370ee23b14e1421a6 |
# Dataset of nowaki/野分/野分 (Azur Lane)
This is the dataset of nowaki/野分/野分 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `ahoge, long_hair, black_hair, brown_eyes, headgear, yellow_eyes, very_long_hair, bangs, breasts, blue_ribbon, small_breasts, bow, brown_hair, hair_between_eyes, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 10.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 29 | 15.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 10.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 29 | 18.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nowaki_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nowaki_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | blush, 1girl, looking_at_viewer, solo, black_skirt, midriff, navel, pleated_skirt, collarbone, detached_sleeves, parted_lips, sailor_collar, white_shirt, crop_top, simple_background, single_thighhigh, sleeveless, white_background, bare_shoulders, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | blush | 1girl | looking_at_viewer | solo | black_skirt | midriff | navel | pleated_skirt | collarbone | detached_sleeves | parted_lips | sailor_collar | white_shirt | crop_top | simple_background | single_thighhigh | sleeveless | white_background | bare_shoulders | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:--------------------|:-------|:--------------|:----------|:--------|:----------------|:-------------|:-------------------|:--------------|:----------------|:--------------|:-----------|:--------------------|:-------------------|:-------------|:-------------------|:-----------------|:---------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/nowaki_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T06:55:24+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T06:58:21+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of nowaki/野分/野分 (Azur Lane)
===================================
This is the dataset of nowaki/野分/野分 (Azur Lane), containing 12 images and their tags.
The core tags of this character are 'ahoge, long\_hair, black\_hair, brown\_eyes, headgear, yellow\_eyes, very\_long\_hair, bangs, breasts, blue\_ribbon, small\_breasts, bow, brown\_hair, hair\_between\_eyes, ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
119cf708cf90865b277aa699c8ae65d0ca2daded |
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2",
"harness_winogrande_5",
split="train")
```
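Beyond a single task split, the per-run layout described above (one configuration per task, timestamped splits plus a "latest" split, and an aggregated "results" configuration) can be explored as in the sketch below. This is only an illustrative snippet using standard `datasets` utilities; the "latest" split and "results" configuration names are taken from this card's file listing.
```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2"

# List every available configuration (the per-task ones plus "results").
configs = get_dataset_config_names(REPO)
print(len(configs), "configurations, e.g.:", configs[:5])

# Load the aggregated metrics; "latest" points at the most recent evaluation run.
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```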
## Latest results
These are the [latest results from run 2024-01-14T06:58:33.296331](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2/blob/main/results_2024-01-14T06-58-33.296331.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6259237317793794,
"acc_stderr": 0.032605779344111435,
"acc_norm": 0.630765473051009,
"acc_norm_stderr": 0.0332633311611487,
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6239282651782372,
"mc2_stderr": 0.015497224514490227
},
"harness|arc:challenge|25": {
"acc": 0.6305460750853242,
"acc_stderr": 0.014104578366491894,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.013830568927974332
},
"harness|hellaswag|10": {
"acc": 0.6633140808603863,
"acc_stderr": 0.004716106475905091,
"acc_norm": 0.8490340569607648,
"acc_norm_stderr": 0.003572839969521999
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520203,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520203
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.02489246917246283,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.02489246917246283
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.02708037281514566,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.02708037281514566
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8220183486238533,
"acc_stderr": 0.016399436366612893,
"acc_norm": 0.8220183486238533,
"acc_norm_stderr": 0.016399436366612893
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069436,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516304,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516304
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066293,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066293
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.02468531686725781,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.02468531686725781
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.01617569201338196,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.01617569201338196
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.026256053835718964,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.026256053835718964
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4426336375488918,
"acc_stderr": 0.012685906538206242,
"acc_norm": 0.4426336375488918,
"acc_norm_stderr": 0.012685906538206242
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.02916312857067073,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.02916312857067073
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.6239282651782372,
"mc2_stderr": 0.015497224514490227
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090252
},
"harness|gsm8k|5": {
"acc": 0.39651250947687644,
"acc_stderr": 0.013474258584033352
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2 | [
"region:us"
] | 2024-01-14T07:00:54+00:00 | {"pretty_name": "Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2](https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T06:58:33.296331](https://huggingface.co/datasets/open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2/blob/main/results_2024-01-14T06-58-33.296331.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6259237317793794,\n \"acc_stderr\": 0.032605779344111435,\n \"acc_norm\": 0.630765473051009,\n \"acc_norm_stderr\": 0.0332633311611487,\n \"mc1\": 0.4467564259485924,\n \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6239282651782372,\n \"mc2_stderr\": 0.015497224514490227\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6305460750853242,\n \"acc_stderr\": 0.014104578366491894,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.013830568927974332\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6633140808603863,\n \"acc_stderr\": 0.004716106475905091,\n \"acc_norm\": 0.8490340569607648,\n \"acc_norm_stderr\": 0.003572839969521999\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520203,\n \"acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520203\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n \"acc_stderr\": 0.02489246917246283,\n \"acc_norm\": 0.7419354838709677,\n \"acc_norm_stderr\": 0.02489246917246283\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 
0.022473253332768776\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097417,\n \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097417\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.27037037037037037,\n \"acc_stderr\": 0.02708037281514566,\n \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.02708037281514566\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8220183486238533,\n \"acc_stderr\": 0.016399436366612893,\n \"acc_norm\": 0.8220183486238533,\n \"acc_norm_stderr\": 0.016399436366612893\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069436,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069436\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516304,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516304\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8288633461047255,\n \"acc_stderr\": 0.013468201614066293,\n \"acc_norm\": 0.8288633461047255,\n \"acc_norm_stderr\": 0.013468201614066293\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.02468531686725781,\n \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.02468531686725781\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n \"acc_stderr\": 0.01617569201338196,\n \"acc_norm\": 0.37318435754189944,\n \"acc_norm_stderr\": 0.01617569201338196\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.026256053835718964,\n \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.026256053835718964\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4426336375488918,\n \"acc_stderr\": 0.012685906538206242,\n \"acc_norm\": 0.4426336375488918,\n \"acc_norm_stderr\": 0.012685906538206242\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.02916312857067073,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.02916312857067073\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4467564259485924,\n \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.6239282651782372,\n \"mc2_stderr\": 0.015497224514490227\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090252\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39651250947687644,\n \"acc_stderr\": 0.013474258584033352\n 
}\n}\n```", "repo_url": "https://huggingface.co/HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-58-33.296331.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-58-33.296331.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-58-33.296331.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T06-58-33.296331.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-58-33.296331.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T06_58_33.296331", "path": ["**/details_harness|winogrande|5_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T06-58-33.296331.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T06_58_33.296331", "path": ["results_2024-01-14T06-58-33.296331.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T06-58-33.296331.parquet"]}]}]} | 2024-01-14T07:01:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2
Dataset automatically created during the evaluation run of model HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
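The loading snippet below is the one given in this record's metadata (`dataset_summary`); `harness_winogrande_5` is just one of the 63 available configurations.

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_HenryJJ__dolphin-2.6-mistral-7b-dpo-orca-v2",
                    "harness_winogrande_5",
                    split="train")
```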
## Latest results
These are the latest results from run 2024-01-14T06:58:33.296331 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
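For quick reference, the aggregated "all" block copied from the full results JSON embedded in this record's metadata reads:

```python
{
    "all": {
        "acc": 0.6259237317793794,
        "acc_stderr": 0.032605779344111435,
        "acc_norm": 0.630765473051009,
        "acc_norm_stderr": 0.0332633311611487,
        "mc1": 0.4467564259485924,
        "mc1_stderr": 0.017403977522557144,
        "mc2": 0.6239282651782372,
        "mc2_stderr": 0.015497224514490227
    }
}
```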
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2\n\n\n\nDataset automatically created during the evaluation run of model HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T06:58:33.296331(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2\n\n\n\nDataset automatically created during the evaluation run of model HenryJJ/dolphin-2.6-mistral-7b-dpo-orca-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T06:58:33.296331(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
58ad295e5d95c9df14c65f7b78747f46444b8cf5 |
# Dataset of halsey_powell/ハルゼー・パウエル/哈尔西·鲍威尔 (Azur Lane)
This is the dataset of halsey_powell/ハルゼー・パウエル/哈尔西·鲍威尔 (Azur Lane), containing 20 images and their tags.
The core tags of this character are `blue_eyes, long_hair, breasts, ahoge, grey_hair, hair_ornament, twintails, hairclip, bangs, small_breasts, very_long_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 22.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 14.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 45 | 27.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 21.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 45 | 37.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/halsey_powell_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/halsey_powell_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, solo, detached_sleeves, necktie, blush, dress, white_thighhighs, simple_background, white_background, sailor_collar, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | detached_sleeves | necktie | blush | dress | white_thighhighs | simple_background | white_background | sailor_collar | smile |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-------------------|:----------|:--------|:--------|:-------------------|:--------------------|:-------------------|:----------------|:--------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/halsey_powell_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:10:39+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:14:59+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of halsey\_powell/ハルゼー・パウエル/哈尔西·鲍威尔 (Azur Lane)
=======================================================
This is the dataset of halsey\_powell/ハルゼー・パウエル/哈尔西·鲍威尔 (Azur Lane), containing 20 images and their tags.
The core tags of this character are 'blue\_eyes, long\_hair, breasts, ahoge, grey\_hair, hair\_ornament, twintails, hairclip, bangs, small\_breasts, very\_long\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
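The snippet referred to here is the one shown in the full card earlier in this record; a condensed version of the same waifuc loading code (same repo id and filename) is:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive and extract it locally
zip_file = hf_hub_download(repo_id='CyberHarem/halsey_powell_azurlane',
                           repo_type='dataset', filename='dataset-raw.zip')
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# iterate over the tagged images with waifuc
for item in LocalSource(dataset_dir):
    print(item.image, item.meta['filename'], item.meta['tags'])
```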
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
097e130d0e6c075dab7759116a35bf8e66483a56 |
# Dataset of chkalov/チカロフ/契卡洛夫 (Azur Lane)
This is the dataset of chkalov/チカロフ/契卡洛夫 (Azur Lane), containing 20 images and their tags.
The core tags of this character are `breasts, long_hair, large_breasts, yellow_eyes, bangs, grey_hair, hair_between_eyes, mole, mole_under_eye, parted_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 20 | 30.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 20 | 14.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 52 | 34.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 20 | 25.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 52 | 52.62 MiB | [Download](https://huggingface.co/datasets/CyberHarem/chkalov_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/chkalov_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, cleavage, solo, blush, smile, black_choker, black_gloves, black_shirt, open_clothes, white_coat, collarbone, jewelry, parted_lips, skirt, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | cleavage | solo | blush | smile | black_choker | black_gloves | black_shirt | open_clothes | white_coat | collarbone | jewelry | parted_lips | skirt | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-----------|:-------|:--------|:--------|:---------------|:---------------|:--------------|:---------------|:-------------|:-------------|:----------|:--------------|:--------|:-------------------|
| 0 | 20 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/chkalov_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:10:51+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:16:19+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of chkalov/チカロフ/契卡洛夫 (Azur Lane)
========================================
This is the dataset of chkalov/チカロフ/契卡洛夫 (Azur Lane), containing 20 images and their tags.
The core tags of this character are 'breasts, long\_hair, large\_breasts, yellow\_eyes, bangs, grey\_hair, hair\_between\_eyes, mole, mole\_under\_eye, parted\_bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
a4068bfca8c1e777595ce1b4f4b3d5460cef2130 |
# Dataset Card for Evaluation run of FelixChao/NinjaDolphin-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FelixChao/NinjaDolphin-7B](https://huggingface.co/FelixChao/NinjaDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B",
"harness_winogrande_5",
split="train")
```
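The same pattern works for any other task configuration listed in this repository's metadata; split names are either the run timestamp or `"latest"`. As a sketch (config names such as `harness_arc_challenge_25` and the aggregated `results` configuration are taken from the repository metadata):

```python
from datasets import load_dataset

# per-task details, pinned to the most recent run via the "latest" split
arc_details = load_dataset(
    "open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B",
    "harness_arc_challenge_25",
    split="latest",
)

# aggregated metrics for the whole run
results = load_dataset(
    "open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B",
    "results",
    split="latest",
)
print(results[0])
```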
## Latest results
These are the [latest results from run 2024-01-14T07:09:51.567777](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B/blob/main/results_2024-01-14T07-09-51.567777.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6473574572689726,
"acc_stderr": 0.03204891578067438,
"acc_norm": 0.6480292804481895,
"acc_norm_stderr": 0.03270235131918203,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5494303013121583,
"mc2_stderr": 0.015522294140989212
},
"harness|arc:challenge|25": {
"acc": 0.6186006825938567,
"acc_stderr": 0.014194389086685247,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156218
},
"harness|hellaswag|10": {
"acc": 0.6649073889663414,
"acc_stderr": 0.0047105814966393374,
"acc_norm": 0.8535152360087632,
"acc_norm_stderr": 0.0035286889976580537
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807897,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389104,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389104
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.03036037971029195,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.03036037971029195
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608313,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.01611523550486547,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.01611523550486547
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959614,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959614
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.012741974333897227,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.012741974333897227
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.028814722422254184,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.028814722422254184
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399683,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399683
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5494303013121583,
"mc2_stderr": 0.015522294140989212
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050374
},
"harness|gsm8k|5": {
"acc": 0.6785443517816527,
"acc_stderr": 0.012864471384836703
}
}
```
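If you prefer to work with the raw JSON file rather than the `datasets` configurations, a small sketch is given below; it assumes the results file keeps the name shown in the link above and that its layout matches the block just shown (a nested `"results"` key is also handled in case the on-disk file wraps these metrics).

```python
import json

from huggingface_hub import hf_hub_download

# download the per-run results file from this dataset repository
results_path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B",
    repo_type="dataset",
    filename="results_2024-01-14T07-09-51.567777.json",
)

with open(results_path, "r", encoding="utf-8") as f:
    data = json.load(f)

# the block above shows "all" at the top level; fall back to a nested "results" key if present
metrics = data.get("results", data)
print(metrics["all"]["acc"], metrics["all"]["acc_norm"])
print(metrics["harness|gsm8k|5"]["acc"])
```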
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B | [
"region:us"
] | 2024-01-14T07:12:07+00:00 | {"pretty_name": "Evaluation run of FelixChao/NinjaDolphin-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/NinjaDolphin-7B](https://huggingface.co/FelixChao/NinjaDolphin-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T07:09:51.567777](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B/blob/main/results_2024-01-14T07-09-51.567777.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6473574572689726,\n \"acc_stderr\": 0.03204891578067438,\n \"acc_norm\": 0.6480292804481895,\n \"acc_norm_stderr\": 0.03270235131918203,\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5494303013121583,\n \"mc2_stderr\": 0.015522294140989212\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6186006825938567,\n \"acc_stderr\": 0.014194389086685247,\n \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156218\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6649073889663414,\n \"acc_stderr\": 0.0047105814966393374,\n \"acc_norm\": 0.8535152360087632,\n \"acc_norm_stderr\": 0.0035286889976580537\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 
0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389104,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389104\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.03036037971029195,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.03036037971029195\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8301404853128991,\n \"acc_stderr\": 0.013428186370608313,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608313\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n \"acc_stderr\": 0.01611523550486547,\n \"acc_norm\": 0.3664804469273743,\n \"acc_norm_stderr\": 0.01611523550486547\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959614,\n \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959614\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n \"acc_stderr\": 0.012741974333897227,\n \"acc_norm\": 0.4667535853976532,\n \"acc_norm_stderr\": 0.012741974333897227\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.028814722422254184,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.028814722422254184\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399683,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399683\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5494303013121583,\n \"mc2_stderr\": 0.015522294140989212\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050374\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6785443517816527,\n \"acc_stderr\": 0.012864471384836703\n 
}\n}\n```", "repo_url": "https://huggingface.co/FelixChao/NinjaDolphin-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-09-51.567777.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-09-51.567777.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-09-51.567777.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-09-51.567777.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-09-51.567777.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T07_09_51.567777", "path": ["**/details_harness|winogrande|5_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T07-09-51.567777.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2024_01_14T07_09_51.567777", "path": ["results_2024-01-14T07-09-51.567777.parquet"]}, {"split": "latest", "path": ["results_2024-01-14T07-09-51.567777.parquet"]}]}]} | 2024-01-14T07:12:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/NinjaDolphin-7B
Dataset automatically created during the evaluation run of model FelixChao/NinjaDolphin-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
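The code snippet that normally follows this sentence was stripped from this text-only rendering of the card. A minimal sketch of what it would look like, assuming this run follows the usual `open-llm-leaderboard/details_<org>__<model>` repository naming (the exact id is not shown in this excerpt), is:

```python
from datasets import load_dataset

# Hypothetical repository id inferred from the standard naming pattern;
# verify it against the actual dataset page before relying on it.
# "latest" is the alias split listed in this card's configuration metadata.
data = load_dataset("open-llm-leaderboard/details_FelixChao__NinjaDolphin-7B",
                    "harness_winogrande_5",
                    split="latest")
```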
## Latest results
These are the latest results from run 2024-01-14T07:09:51.567777 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FelixChao/NinjaDolphin-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/NinjaDolphin-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T07:09:51.567777(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/NinjaDolphin-7B\n\n\n\nDataset automatically created during the evaluation run of model FelixChao/NinjaDolphin-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T07:09:51.567777(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
d2e7f5c27b7d5058a0e4dc638e2a3b68adadd10c |
# Dataset Card for Evaluation run of jan-hq/stealth-v1.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.2](https://huggingface.co/jan-hq/stealth-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v1.2",
"harness_winogrande_5",
split="train")
```
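Note that, per the configuration metadata of this dataset, the splits are actually named after the run timestamp plus a `latest` alias; if `split="train"` does not resolve for you, a hedged alternative is to request the `latest` split explicitly:

```python
from datasets import load_dataset

# "latest" is the alias split listed in this card's metadata; it always
# points at the parquet files of the most recent evaluation run.
data = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v1.2",
                    "harness_winogrande_5",
                    split="latest")
```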
## Latest results
These are the [latest results from run 2024-01-14T07:26:57.769050](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.2/blob/main/results_2024-01-14T07-26-57.769050.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.647630716246248,
"acc_stderr": 0.032067729447727726,
"acc_norm": 0.64733509976457,
"acc_norm_stderr": 0.032732372289814314,
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5422529201207911,
"mc2_stderr": 0.01526615960034381
},
"harness|arc:challenge|25": {
"acc": 0.6331058020477816,
"acc_stderr": 0.014084133118104296,
"acc_norm": 0.6638225255972696,
"acc_norm_stderr": 0.013804855026205761
},
"harness|hellaswag|10": {
"acc": 0.6748655646285601,
"acc_stderr": 0.004674677287148613,
"acc_norm": 0.8613821947819159,
"acc_norm_stderr": 0.003448410595239921
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335075,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335075
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083018,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507337,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507337
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741622,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741622
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40670391061452515,
"acc_stderr": 0.01642881191589886,
"acc_norm": 0.40670391061452515,
"acc_norm_stderr": 0.01642881191589886
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042114,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653354,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653354
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3769889840881273,
"mc1_stderr": 0.01696551757893035,
"mc2": 0.5422529201207911,
"mc2_stderr": 0.01526615960034381
},
"harness|winogrande|5": {
"acc": 0.8074191002367798,
"acc_stderr": 0.011082538847491904
},
"harness|gsm8k|5": {
"acc": 0.7225170583775588,
"acc_stderr": 0.012333447581047525
}
}
```
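If you only need the aggregated metrics shown above rather than the per-sample details, a minimal sketch (using the `results` configuration and `latest` split that this card's metadata declares) is:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent results parquet file.
results = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v1.2",
                       "results",
                       split="latest")
print(results[0])  # e.g. the "all"/per-task accuracy figures reproduced above
```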
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jan-hq__stealth-v1.2 | [
"region:us"
] | 2024-01-14T07:29:20+00:00 | {"pretty_name": "Evaluation run of jan-hq/stealth-v1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.2](https://huggingface.co/jan-hq/stealth-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__stealth-v1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T07:26:57.769050](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.2/blob/main/results_2024-01-14T07-26-57.769050.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.647630716246248,\n \"acc_stderr\": 0.032067729447727726,\n \"acc_norm\": 0.64733509976457,\n \"acc_norm_stderr\": 0.032732372289814314,\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5422529201207911,\n \"mc2_stderr\": 0.01526615960034381\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6331058020477816,\n \"acc_stderr\": 0.014084133118104296,\n \"acc_norm\": 0.6638225255972696,\n \"acc_norm_stderr\": 0.013804855026205761\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6748655646285601,\n \"acc_stderr\": 0.004674677287148613,\n \"acc_norm\": 0.8613821947819159,\n \"acc_norm_stderr\": 0.003448410595239921\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n 
\"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.658974358974359,\n 
\"acc_stderr\": 0.024035489676335075,\n \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335075\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083018,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083018\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507337,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507337\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n \"acc_stderr\": 0.013223928616741622,\n \"acc_norm\": 
0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741622\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40670391061452515,\n \"acc_stderr\": 0.01642881191589886,\n \"acc_norm\": 0.40670391061452515,\n \"acc_norm_stderr\": 0.01642881191589886\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042114,\n \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653354,\n \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653354\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3769889840881273,\n \"mc1_stderr\": 0.01696551757893035,\n \"mc2\": 0.5422529201207911,\n \"mc2_stderr\": 0.01526615960034381\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8074191002367798,\n \"acc_stderr\": 0.011082538847491904\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7225170583775588,\n \"acc_stderr\": 0.012333447581047525\n }\n}\n```", "repo_url": "https://huggingface.co/jan-hq/stealth-v1.2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-26-57.769050.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-26-57.769050.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-26-57.769050.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-26-57.769050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-26-57.769050.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-26-57.769050.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["**/details_harness|winogrande|5_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T07-26-57.769050.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T07_26_57.769050", "path": ["results_2024-01-14T07-26-57.769050.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T07-26-57.769050.parquet"]}]}]} | 2024-01-14T07:29:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jan-hq/stealth-v1.2
Dataset automatically created during the evaluation run of model jan-hq/stealth-v1.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
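A minimal sketch, assuming the details repository follows the usual leaderboard naming pattern (`open-llm-leaderboard/details_jan-hq__stealth-v1.2`) and using the `harness_winogrande_5` configuration from the list above as an example:

```python
from datasets import load_dataset

# Repository id is assumed from the standard leaderboard naming pattern;
# "harness_winogrande_5" is one of the 63 configurations of this run.
data = load_dataset(
    "open-llm-leaderboard/details_jan-hq__stealth-v1.2",
    "harness_winogrande_5",
    split="train",
)
```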
## Latest results
These are the latest results from run 2024-01-14T07:26:57.769050 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jan-hq/stealth-v1.2\n\n\n\nDataset automatically created during the evaluation run of model jan-hq/stealth-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T07:26:57.769050(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jan-hq/stealth-v1.2\n\n\n\nDataset automatically created during the evaluation run of model jan-hq/stealth-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T07:26:57.769050(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
0398aa0d6dad0e29d4e2047005b11a58d54c7f0d |
# Dataset of cavalla/カヴァラ/棘鳍 (Azur Lane)
This is the dataset of cavalla/カヴァラ/棘鳍 (Azur Lane), containing 13 images and their tags.
The core tags of this character are `long_hair, blonde_hair, ponytail, breasts, small_breasts, bangs, fang`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 13 | 22.64 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 13 | 11.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 32 | 23.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 13 | 18.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 32 | 34.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cavalla_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cavalla_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, looking_at_viewer, smile, open_mouth, blush, solo, bare_shoulders, holding |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | smile | open_mouth | blush | solo | bare_shoulders | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------|:-------------|:--------|:-------|:-----------------|:----------|
| 0 | 13 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X |
| CyberHarem/cavalla_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:31:04+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:37:20+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of cavalla/カヴァラ/棘鳍 (Azur Lane)
======================================
This is the dataset of cavalla/カヴァラ/棘鳍 (Azur Lane), containing 13 images and their tags.
The core tags of this character are 'long\_hair, blonde\_hair, ponytail, breasts, small\_breasts, bangs, fang', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
8988a7f752da125d9409d5d4f36ad8af605b597d |
# Dataset Card for Evaluation run of jan-hq/stealth-v1.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.3](https://huggingface.co/jan-hq/stealth-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jan-hq__stealth-v1.3",
"harness_winogrande_5",
split="train")
```
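The aggregated scores reported below can be pulled in the same way; a minimal sketch, assuming the `results` configuration and its `latest` split follow the same layout as the other runs in this collection:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run;
# the "latest" split always maps to the newest evaluation timestamp.
results = load_dataset(
    "open-llm-leaderboard/details_jan-hq__stealth-v1.3",
    "results",
    split="latest",
)
print(results[0])
```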
## Latest results
These are the [latest results from run 2024-01-14T07:33:07.818995](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.3/blob/main/results_2024-01-14T07-33-07.818995.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6295492975801164,
"acc_stderr": 0.032574778382655614,
"acc_norm": 0.631127878406581,
"acc_norm_stderr": 0.033231725904867095,
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272178,
"mc2": 0.591209574646901,
"mc2_stderr": 0.015611059031702696
},
"harness|arc:challenge|25": {
"acc": 0.6160409556313993,
"acc_stderr": 0.01421244498065189,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.013921008595179342
},
"harness|hellaswag|10": {
"acc": 0.6489743079067914,
"acc_stderr": 0.004763155068744876,
"acc_norm": 0.844353714399522,
"acc_norm_stderr": 0.003617787934747749
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509476,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509476
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794086,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794086
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876164,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876164
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.016125543823552947,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.016125543823552947
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.026787453111906504,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.026787453111906504
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4745762711864407,
"acc_stderr": 0.012753716929101004,
"acc_norm": 0.4745762711864407,
"acc_norm_stderr": 0.012753716929101004
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6397058823529411,
"acc_stderr": 0.029163128570670733,
"acc_norm": 0.6397058823529411,
"acc_norm_stderr": 0.029163128570670733
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6290849673202614,
"acc_stderr": 0.01954210156485412,
"acc_norm": 0.6290849673202614,
"acc_norm_stderr": 0.01954210156485412
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4173806609547124,
"mc1_stderr": 0.017262891063272178,
"mc2": 0.591209574646901,
"mc2_stderr": 0.015611059031702696
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090254
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.013428382481274252
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_jan-hq__stealth-v1.3 | [
"region:us"
] | 2024-01-14T07:35:31+00:00 | {"pretty_name": "Evaluation run of jan-hq/stealth-v1.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [jan-hq/stealth-v1.3](https://huggingface.co/jan-hq/stealth-v1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jan-hq__stealth-v1.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2024-01-14T07:33:07.818995](https://huggingface.co/datasets/open-llm-leaderboard/details_jan-hq__stealth-v1.3/blob/main/results_2024-01-14T07-33-07.818995.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6295492975801164,\n \"acc_stderr\": 0.032574778382655614,\n \"acc_norm\": 0.631127878406581,\n \"acc_norm_stderr\": 0.033231725904867095,\n \"mc1\": 0.4173806609547124,\n \"mc1_stderr\": 0.017262891063272178,\n \"mc2\": 0.591209574646901,\n \"mc2_stderr\": 0.015611059031702696\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179342\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6489743079067914,\n \"acc_stderr\": 0.004763155068744876,\n \"acc_norm\": 0.844353714399522,\n \"acc_norm_stderr\": 0.003617787934747749\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411022,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411022\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n 
\"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n 
\"acc_stderr\": 0.02439667298509476,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509476\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233504,\n \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233504\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.768595041322314,\n \"acc_stderr\": 0.03849856098794086,\n \"acc_norm\": 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794086\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n \"acc_stderr\": 0.013890862162876164,\n \"acc_norm\": 0.8148148148148148,\n \"acc_norm_stderr\": 
0.013890862162876164\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n \"acc_stderr\": 0.016125543823552947,\n \"acc_norm\": 0.3675977653631285,\n \"acc_norm_stderr\": 0.016125543823552947\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.026787453111906504,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.026787453111906504\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4745762711864407,\n \"acc_stderr\": 0.012753716929101004,\n \"acc_norm\": 0.4745762711864407,\n \"acc_norm_stderr\": 0.012753716929101004\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6397058823529411,\n \"acc_stderr\": 0.029163128570670733,\n \"acc_norm\": 0.6397058823529411,\n \"acc_norm_stderr\": 0.029163128570670733\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6290849673202614,\n \"acc_stderr\": 0.01954210156485412,\n \"acc_norm\": 0.6290849673202614,\n \"acc_norm_stderr\": 0.01954210156485412\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4173806609547124,\n \"mc1_stderr\": 0.017262891063272178,\n \"mc2\": 0.591209574646901,\n \"mc2_stderr\": 0.015611059031702696\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090254\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \"acc_stderr\": 0.013428382481274252\n }\n}\n```", "repo_url": "https://huggingface.co/jan-hq/stealth-v1.3", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet", "**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2024-01-14T07-33-07.818995.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["**/details_harness|winogrande|5_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2024-01-14T07-33-07.818995.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2024_01_14T07_33_07.818995", "path": ["results_2024-01-14T07-33-07.818995.parquet"]}, {"split": "latest", "path": 
["results_2024-01-14T07-33-07.818995.parquet"]}]}]} | 2024-01-14T07:35:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of jan-hq/stealth-v1.3
Dataset automatically created during the evaluation run of model jan-hq/stealth-v1.3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
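A minimal sketch of loading one of the detail configurations with the `datasets` library. The repository id below follows the leaderboard's usual `details_<org>__<model>` naming and is an assumption, not something stated in this card; the config and split names are taken from the metadata above:

```python
from datasets import load_dataset

# Assumed repository id (Open LLM Leaderboard naming convention);
# adjust it if the actual details repository differs.
data = load_dataset(
    "open-llm-leaderboard/details_jan-hq__stealth-v1.3",
    "harness_winogrande_5",   # any config name listed in the metadata works
    split="latest",           # or the timestamped split "2024_01_14T07_33_07.818995"
)
print(data[0])
```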
## Latest results
These are the latest results from run 2024-01-14T07:33:07.818995 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
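The aggregated numbers can also be pulled programmatically from the "results" configuration; this is a short sketch under the same assumed repository id as above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics shown on the leaderboard;
# the "latest" split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_jan-hq__stealth-v1.3",
    "results",
    split="latest",
)
print(results[0])
```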
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of jan-hq/stealth-v1.3\n\n\n\nDataset automatically created during the evaluation run of model jan-hq/stealth-v1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T07:33:07.818995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jan-hq/stealth-v1.3\n\n\n\nDataset automatically created during the evaluation run of model jan-hq/stealth-v1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2024-01-14T07:33:07.818995(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] |
b46d603d5e61be6b279806bfe4fce69dea17bf07 |
# Dataset of kako/加古/加古 (Azur Lane)
This is the dataset of kako/加古/加古 (Azur Lane), containing 12 images and their tags.
The core tags of this character are `braid, brown_hair, long_hair, glasses, semi-rimless_eyewear, twin_braids, under-rim_eyewear, red-framed_eyewear, animal_ears, breasts, large_breasts, aqua_eyes, bangs, between_breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 12 | 8.63 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 12 | 7.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 25 | 13.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 12 | 8.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 25 | 14.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kako_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kako_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
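The same download-and-extract pattern works for the pre-processed IMG+TXT packages listed above; for example, for the 800px package (file name taken from the table, everything else unchanged):

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT archive instead of the raw one
zip_file = hf_hub_download(
    repo_id='CyberHarem/kako_azurlane',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract the images and their paired .txt tag files
extract_dir = 'kako_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)
```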
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | sailor_collar, 1girl, solo, pleated_skirt, crop_top, detached_sleeves, looking_at_viewer, neckerchief, retrofit_(azur_lane), black_skirt, midriff, closed_mouth, sleeveless_shirt, white_gloves, white_thighhighs, wide_sleeves, blush, miniskirt, navel, adjusting_eyewear, bare_shoulders, serafuku, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | sailor_collar | 1girl | solo | pleated_skirt | crop_top | detached_sleeves | looking_at_viewer | neckerchief | retrofit_(azur_lane) | black_skirt | midriff | closed_mouth | sleeveless_shirt | white_gloves | white_thighhighs | wide_sleeves | blush | miniskirt | navel | adjusting_eyewear | bare_shoulders | serafuku | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------|:--------|:-------|:----------------|:-----------|:-------------------|:--------------------|:--------------|:-----------------------|:--------------|:----------|:---------------|:-------------------|:---------------|:-------------------|:---------------|:--------|:------------|:--------|:--------------------|:-----------------|:-----------|:--------------------|
| 0 | 12 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
| CyberHarem/kako_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2024-01-14T07:35:56+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2024-01-14T07:38:49+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of kako/加古/加古 (Azur Lane)
=================================
This is the dataset of kako/加古/加古 (Azur Lane), containing 12 images and their tags.
The core tags of this character are 'braid, brown\_hair, long\_hair, glasses, semi-rimless\_eyewear, twin\_braids, under-rim\_eyewear, red-framed\_eyewear, animal\_ears, breasts, large\_breasts, aqua\_eyes, bangs, between\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
| [
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |