Shape Polygons Dataset

A synthetic dataset of 70,000 images of variously colored polygons (triangles through octagons) rendered on black backgrounds.

Dataset Description

This dataset consists of programmatically generated polygon images with full metadata about each shape's properties. It's designed for tasks such as:

  • Shape Classification: Classify polygons by number of vertices (3-8)
  • Regression Tasks: Predict shape properties (size, angle, position, color)
  • Object Detection: Locate and identify shapes within images
  • Generative Models: Train models to generate geometric shapes

Dataset Statistics

Split   Number of Images
Train   60,000
Test    10,000
Total   70,000

Shape Types

The dataset includes 6 different polygon types:

  • Triangle (3 vertices)
  • Quadrilateral (4 vertices)
  • Pentagon (5 vertices)
  • Hexagon (6 vertices)
  • Heptagon (7 vertices)
  • Octagon (8 vertices)
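
Because the vertex count fully determines the shape type, a tiny lookup table is enough to turn the vertices value into a readable name or a 0-based class index. A minimal sketch (the dictionary and helper below are illustrative, not shipped with the dataset):

# Map vertex counts (3-8) to shape names; the class index is simply vertices - 3
SHAPE_NAMES = {
    3: "triangle",
    4: "quadrilateral",
    5: "pentagon",
    6: "hexagon",
    7: "heptagon",
    8: "octagon",
}

def vertices_to_label(vertices):
    """Return (shape name, class index in 0-5) for a vertex count in 3-8."""
    return SHAPE_NAMES[vertices], vertices - 3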

Dataset Structure

shape-polygons-dataset/
β”œβ”€β”€ train/
β”‚   β”œβ”€β”€ images/
β”‚   β”‚   β”œβ”€β”€ 00001.png
β”‚   β”‚   β”œβ”€β”€ 00002.png
β”‚   β”‚   └── ... (60,000 images)
β”‚   └── metadata.csv
β”œβ”€β”€ test/
β”‚   β”œβ”€β”€ images/
β”‚   β”‚   β”œβ”€β”€ 00001.png
β”‚   β”‚   β”œβ”€β”€ 00002.png
β”‚   β”‚   └── ... (10,000 images)
β”‚   └── metadata.csv
└── README.md
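
With the layout above, a quick consistency check between each images/ folder and its metadata.csv takes only a few lines. The sketch below assumes a local copy of the repository; dataset_root is a placeholder path:

import os
import pandas as pd

dataset_root = "shape-polygons-dataset"  # placeholder: path to a local copy

for split in ("train", "test"):
    metadata = pd.read_csv(os.path.join(dataset_root, split, "metadata.csv"))
    images = set(os.listdir(os.path.join(dataset_root, split, "images")))
    missing = [name for name in metadata["filename"] if name not in images]
    print(f"{split}: {len(metadata)} metadata rows, {len(images)} images, {len(missing)} missing")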

Metadata Fields

Each metadata.csv contains the following columns:

Column     Type     Description
filename   string   Image filename (e.g., "00001.png")
size       float    Relative size of the polygon (0.0 - 1.0)
angle      float    Rotation angle in degrees (0.0 - 360.0)
vertices   int      Number of vertices (3-8)
center_x   float    X-coordinate of center (0.0 - 1.0, normalized)
center_y   float    Y-coordinate of center (0.0 - 1.0, normalized)
color_r    float    Red color component (0.0 - 1.0)
color_g    float    Green color component (0.0 - 1.0)
color_b    float    Blue color component (0.0 - 1.0)
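
Since positions and colors are stored as normalized floats, they usually need to be rescaled before drawing or comparing against pixel data. The helper below is a sketch; the 128-pixel image size is an assumed example, not a documented property of the dataset:

def denormalize(row, image_size=128):
    """Convert normalized metadata fields to pixel coordinates and 8-bit RGB.

    image_size is an assumption for illustration; use the actual image dimensions.
    """
    center_px = (row["center_x"] * image_size, row["center_y"] * image_size)
    color_rgb = tuple(round(row[c] * 255) for c in ("color_r", "color_g", "color_b"))
    return center_px, color_rgb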

Sample Images

Here are some example images from the dataset:

[Four sample images from the dataset: Sample 1 through Sample 4]

Usage

Loading with Hugging Face Datasets

from datasets import load_dataset

# Load the dataset
dataset = load_dataset("your-username/shape-polygons-dataset")

# Access train and test splits
train_data = dataset["train"]
test_data = dataset["test"]

# Get a sample
sample = train_data[0]
print(f"Vertices: {sample['vertices']}, Size: {sample['size']:.2f}")

Loading with Pandas

import pandas as pd
from PIL import Image
import os

# Load metadata
train_metadata = pd.read_csv("train/metadata.csv")
test_metadata = pd.read_csv("test/metadata.csv")

# Load an image
img_path = os.path.join("train/images", train_metadata.iloc[0]["filename"])
image = Image.open(img_path)
image.show()

# Filter by number of vertices (e.g., triangles only)
triangles = train_metadata[train_metadata["vertices"] == 3]
print(f"Number of triangles: {len(triangles)}")

PyTorch DataLoader Example

import torch
from torch.utils.data import Dataset, DataLoader
from torchvision import transforms
from PIL import Image
import pandas as pd
import os

class PolygonDataset(Dataset):
    def __init__(self, root_dir, split="train", transform=None):
        self.root_dir = root_dir
        self.split = split
        self.transform = transform
        self.metadata = pd.read_csv(os.path.join(root_dir, split, "metadata.csv"))
    
    def __len__(self):
        return len(self.metadata)
    
    def __getitem__(self, idx):
        row = self.metadata.iloc[idx]
        img_path = os.path.join(self.root_dir, self.split, "images", row["filename"])
        image = Image.open(img_path).convert("RGB")
        
        if self.transform:
            image = self.transform(image)
        
        # Number of vertices as classification label (0-5 for 3-8 vertices)
        label = row["vertices"] - 3
        
        return image, label

# Create dataset and dataloader; ToTensor converts PIL images to tensors so the
# default collate function can stack them into batches
dataset = PolygonDataset("path/to/dataset", split="train", transform=transforms.ToTensor())
dataloader = DataLoader(dataset, batch_size=32, shuffle=True)
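
To show how the DataLoader feeds a classifier, here is a minimal, untuned training sketch; the network, optimizer, and learning rate are placeholder choices, and adaptive pooling avoids assuming a particular image size:

import torch.nn as nn

# Tiny CNN with 6 outputs, one per vertex class (3-8 vertices -> labels 0-5)
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(32, 6),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in dataloader:  # one pass over the training split
    optimizer.zero_grad()
    loss = criterion(model(images), labels.long())
    loss.backward()
    optimizer.step()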

Use Cases

  1. Beginner-Friendly ML Projects: Simple dataset for learning image classification
  2. Shape Recognition Systems: Training models to identify geometric shapes
  3. Property Regression: Predicting continuous values such as size, angle, and position (see the sketch after this list)
  4. Multi-Task Learning: Combining classification and regression objectives
  5. Data Augmentation Research: Studying effects of synthetic data on model performance
  6. Benchmark Dataset: Evaluating new architectures on a controlled, balanced dataset
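
For the regression use case above, the continuous metadata columns can be used directly as targets. A sketch reusing train_metadata from the Pandas example; the column choice and angle scaling are illustrative:

import numpy as np

# Stack the continuous shape properties into a (num_images, 7) target matrix
target_columns = ["size", "angle", "center_x", "center_y", "color_r", "color_g", "color_b"]
targets = train_metadata[target_columns].to_numpy(dtype=np.float32)

# Scale the angle into [0, 1] so all targets share a comparable range
targets[:, 1] /= 360.0
print(targets.shape)  # (60000, 7) for the train split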

License

This dataset is released under the MIT License.

Citation

If you use this dataset in your research, please cite it as:

@dataset{shape_polygons_dataset,
  title={Shape Polygons Dataset},
  year={2024},
  url={https://huggingface.co/datasets/your-username/shape-polygons-dataset},
  note={A synthetic dataset of 70,000 polygon images for computer vision tasks}
}

Contributing

Contributions are welcome! Feel free to:

  • Report issues
  • Suggest improvements
  • Submit pull requests

Contact

For questions or feedback, please open an issue on the repository.
