---
license: apache-2.0
size_categories:
- 1K<n<10K
dataset_info:
  features:
  - name: id
    dtype: int32
  - name: image
    dtype: image
  - name: sensor_type
    dtype: string
  - name: question_type
    dtype: string
  - name: question
    dtype: string
  - name: question_query
    dtype: string
  - name: answer
    dtype: string
  splits:
  - name: train
    num_bytes: 1455392605.0
    num_examples: 6248
  download_size: 903353168
  dataset_size: 1455392605.0
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---
# SPARK (multi-vision Sensor Perception And Reasoning benchmarK)
<!-- Provide a quick summary of the dataset. -->
SPARK is a benchmark designed to reduce the fundamental multi-vision sensor information gap between images and the multi-vision sensors that capture them. It consists of 6,248 automatically generated vision-language test samples that probe multi-vision sensory perception and multi-vision sensory reasoning, i.e., proficiency in physical sensor knowledge, across different question formats and types of sensor-related questions.
## Dataset Details
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
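A minimal loading sketch with the 🤗 `datasets` library is shown below. The repository id `topyun/SPARK` is an assumption inferred from this card's location; adjust it if the dataset is hosted under a different name.

```python
from datasets import load_dataset

# Load the single "train" split defined in the configs above.
# NOTE: the repository id "topyun/SPARK" is assumed from this card's location.
ds = load_dataset("topyun/SPARK", split="train")

# Each example exposes the features listed in the card metadata.
sample = ds[0]
print(sample["id"], sample["sensor_type"], sample["question_type"])
print("Q:", sample["question"])
print("A:", sample["answer"])

img = sample["image"]  # decoded to a PIL.Image by the `image` feature type
print(img.size)
```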
### Source Data
#### Data Collection and Processing
The test samples are built from five public datasets: [MS-COCO](https://arxiv.org/abs/1405.0312), [M3FD](https://arxiv.org/abs/2203.16220v1), [Dog&People](https://public.roboflow.com/object-detection/thermal-dogs-and-people), [RGB-D scene dataset](https://arxiv.org/abs/2110.11590), and the [UNIFESP X-ray Body Part Classifier Competition dataset](https://www.kaggle.com/competitions/unifesp-x-ray-body-part-classifier).
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Contact
[SangYun Chung](https://sites.google.com/view/sang-yun-chung/profile): [email protected]