Tasks: Image Segmentation (sub-task: semantic-segmentation)
Modalities: Image
Formats: parquet
Size: 1K - 10K
License:
How to compress images in parquet? (#3, opened by Divelix)
Hi, thank you for open-sourcing this dataset! When I extracted the images from your parquet file, I was amazed by the compression ratio: the parquet file is 324 MB, while the output PNG images took around 2.4 GB of disk space!
I found your discussion on the forum (https://discuss.huggingface.co/t/image-dataset-best-practices/13974), but when I tried the dataset.map() trick, I could only get a parquet file of the same size as the original data. Can you please explain how you managed to achieve such a high compression ratio?
@Divelix How are you extracting this parquet file? Please share any relevant code and the modules you used. Thank you!