# AI Ecosystem Project

This project contains datasets and analysis tools for exploring the AI ecosystem, with a particular focus on models available on HuggingFace.
## Setup

### Virtual Environment

A Python virtual environment has been created with all necessary dependencies installed.

To activate the virtual environment:

**Option 1: Use the activation script**

```bash
./activate_env.sh
```

**Option 2: Manual activation**

```bash
source venv/bin/activate
```

To deactivate:

```bash
deactivate
```
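If it is unclear whether the environment is active, one quick check is to ask Python which interpreter is running; with the virtual environment active, the path should point inside the `venv/` directory. This is a minimal sketch, not part of the project's scripts:

```python
import sys

# With the virtual environment active, both paths should point into venv/.
print(sys.executable)
print(sys.prefix)
```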
## Dependencies

The following packages are installed in the virtual environment (a quick import check is sketched after the list):

### Core Data Science

- `numpy>=1.24.0` - Numerical computing
- `pandas>=2.0.0` - Data manipulation and analysis
- `seaborn>=0.12.0` - Statistical data visualization
- `matplotlib>=3.7.0` - Plotting library

### Jupyter and Development

- `jupyter>=1.0.0` - Jupyter notebook environment
- `ipykernel>=6.0.0` - Python kernel for Jupyter
- `notebook>=6.5.0` - Classic Jupyter notebook

### Data Processing and Analysis

- `scikit-learn>=1.3.0` - Machine learning library
- `scipy>=1.10.0` - Scientific computing

### Visualization

- `plotly>=5.15.0` - Interactive plotting
- `bokeh>=3.0.0` - Interactive visualization

### AI/ML Specific

- `torch>=2.0.0` - PyTorch deep learning framework
- `transformers>=4.30.0` - HuggingFace transformers library
- `datasets>=2.12.0` - HuggingFace datasets library

### Utilities

- `json5>=0.9.0` - JSON handling
- `tqdm>=4.65.0` - Progress bars
- `requests>=2.31.0` - HTTP library
- `python-dotenv>=1.0.0` - Environment variable management
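As a quick sanity check that the environment resolved correctly, a few of the key packages can be imported from an activated shell. This is a minimal sketch and only covers a subset of the packages listed above:

```python
# Sanity check: import a few key packages and print their versions.
import numpy as np
import pandas as pd
import torch
import transformers

print("numpy:", np.__version__)
print("pandas:", pd.__version__)
print("torch:", torch.__version__)
print("transformers:", transformers.__version__)
```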
## Usage

### Running Jupyter Notebooks

1. Activate the virtual environment:

   ```bash
   ./activate_env.sh
   ```

2. Start Jupyter Notebook:

   ```bash
   jupyter notebook
   ```

   Or start Jupyter Lab:

   ```bash
   jupyter lab
   ```
## Project Files

- `get_expanded_dataset.ipynb` - Notebook for expanding JSON datasets into tabular format (see the sketch after this list)
- `ai_ecosystem_jsons.csv` - Original JSON dataset
- `ai_ecosystem_dataset copy.csv` - Expanded dataset copy
- `ai_ecosystem_withmodelcards copy.csv` - Dataset with model cards
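For orientation, the sketch below shows one way to load the original dataset and expand a JSON-encoded column into tabular form with pandas, which is the kind of transformation `get_expanded_dataset.ipynb` performs. The column name `json_data` is a hypothetical placeholder; the actual column names in `ai_ecosystem_jsons.csv` may differ, so check the printed columns first.

```python
import json
import pandas as pd

# Load the original dataset (assumes the CSV sits at the repository root).
df = pd.read_csv("ai_ecosystem_jsons.csv")
print(df.columns.tolist())

# Hypothetical example: expand a column of JSON strings into separate columns.
# "json_data" is a placeholder name; substitute the real column from the file.
records = df["json_data"].apply(json.loads).tolist()
expanded = pd.json_normalize(records)

# Combine the expanded columns with the rest of the original dataset.
result = pd.concat([df.drop(columns=["json_data"]), expanded], axis=1)
print(result.head())
```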
## Notes

- The virtual environment is located in the `venv/` directory
- All dependencies are specified in `requirements.txt`
- Use the activation script for convenience when starting work on this project