---
title: Prashnotri Question Generator
emoji: 📝
colorFrom: blue
colorTo: indigo
sdk: gradio
sdk_version: "5.29.1"
app_file: gradio_app.py
pinned: false
license: mit
---
# Prashnotri – AI Question Generator from PDF
Generate high-quality, context-aware questions from your study materials using Azure OpenAI and LangChain. This app provides a Gradio web interface for uploading PDFs and generating questions tailored to your needs.
---
## Features
- Upload a PDF and generate questions based on its content
- Customize by subject, class, topic, difficulty, Bloom's level, and more
- Uses Azure OpenAI for both embeddings and question generation
- Feedback-driven improvement system
- Ready for Hugging Face Spaces or local deployment
---
## Quick Start
### On Hugging Face Spaces
1. **Upload the repository** to your Hugging Face Space.
2. **Set the following environment variables** in your Space's Settings > Repository secrets:
- `AZURE_OPENAI_API_KEY` β Your Azure OpenAI API key
- `AZURE_OPENAI_ENDPOINT` β Your Azure OpenAI endpoint (e.g. `https://your-resource.openai.azure.com/`)
- `AZURE_OPENAI_EMBEDDING_DEPLOYMENT` β Embedding deployment name
- `AZURE_OPENAI_CHAT_DEPLOYMENT` β Chat deployment name
- `AZURE_OPENAI_API_VERSION` β API version (e.g. `2023-05-15`)
3. The app will launch automatically. Use the web interface to upload a PDF and generate questions.
### Local Development
1. **Clone the repository** and install dependencies:
```bash
pip install -r requirements.txt
```
2. **Create a `.env` file** in the project root:
```
AZURE_OPENAI_API_KEY=your-azure-openai-key
AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com/
AZURE_OPENAI_EMBEDDING_DEPLOYMENT=your-embedding-deployment
AZURE_OPENAI_CHAT_DEPLOYMENT=your-chat-deployment
AZURE_OPENAI_API_VERSION=2023-05-15
```
3. **Run the app:**
```bash
python gradio_app.py
```
4. Open your browser to [http://127.0.0.1:7860](http://127.0.0.1:7860)
---
## Environment Variables Explained
| Variable | Description |
|----------------------------------|---------------------------------------------|
| AZURE_OPENAI_API_KEY | Your Azure OpenAI API key |
| AZURE_OPENAI_ENDPOINT | Your Azure OpenAI endpoint URL |
| AZURE_OPENAI_EMBEDDING_DEPLOYMENT| Embedding deployment name (for embeddings) |
| AZURE_OPENAI_CHAT_DEPLOYMENT | Chat deployment name (for chat/completions) |
| AZURE_OPENAI_API_VERSION | API version (e.g. 2023-05-15) |
- `AZURE_OPENAI_EMBEDDING_DEPLOYMENT` and `AZURE_OPENAI_CHAT_DEPLOYMENT` are used as the `deployment_name` argument in the code.
- A single `AZURE_OPENAI_API_VERSION` value is shared by both the embeddings and chat clients.
- **Do not** set `deployment_name` inside `model_kwargs`βit is passed as a top-level argument.
**Example usage in code:**
```python
import os

from langchain_openai import AzureChatOpenAI, AzureOpenAIEmbeddings

# Embeddings client: deployment_name is a top-level argument, not part of model_kwargs
embeddings = AzureOpenAIEmbeddings(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version=os.environ["AZURE_OPENAI_API_VERSION"],
    deployment_name=os.environ["AZURE_OPENAI_EMBEDDING_DEPLOYMENT"],
)

# Chat client: same pattern, using the chat deployment
llm = AzureChatOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version=os.environ["AZURE_OPENAI_API_VERSION"],
    deployment_name=os.environ["AZURE_OPENAI_CHAT_DEPLOYMENT"],
)
```
---
## Troubleshooting
- **Missing environment variables:** The app will not start if any required variable is missing (see the startup check sketch below).
- **502/Bad Gateway (on Spaces):** Ensure all secrets are set and valid.
- **SSL/Endpoint errors:** Double-check your Azure endpoint URL and API version.
- **Deprecation warnings:** This app is up to date with the latest LangChain and OpenAI SDK requirements.
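To fail fast with a readable error instead of a stack trace, a small startup check along these lines can help (a sketch; the variable list matches the table above, and `gradio_app.py` may already do something similar):
```python
import os
import sys

REQUIRED_VARS = [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_EMBEDDING_DEPLOYMENT",
    "AZURE_OPENAI_CHAT_DEPLOYMENT",
    "AZURE_OPENAI_API_VERSION",
]

# Abort with a clear message listing every missing variable
missing = [name for name in REQUIRED_VARS if not os.environ.get(name)]
if missing:
    sys.exit(f"Missing required environment variables: {', '.join(missing)}")
```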
---
## Support & Feedback
- For issues, open a GitHub issue or contact the maintainer.
- For deployment or feature requests, reach out via email or the project's contact form.
---
**Do not upload your `.env` file to any public repository.**
# Question Generator from PDF (LangChain)
## Running on Hugging Face Spaces
This project includes a Gradio interface for generating questions from PDF study materials using LangChain and Azure OpenAI. You can deploy it directly to [Hugging Face Spaces](https://huggingface.co/spaces):
### Requirements
- Python 3.8+
- The following dependencies (see `requirements.txt`):
- gradio
- huggingface_hub
- openai
- langchain, langchain-community, langchain-core, langchain-openai, langchain-text-splitters
- faiss-cpu
- pypdf
- and others
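A minimal `requirements.txt` covering the packages listed above might look like this (versions and the complete list are defined by the actual file in the repository):
```
gradio
huggingface_hub
openai
langchain
langchain-community
langchain-core
langchain-openai
langchain-text-splitters
faiss-cpu
pypdf
```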
### How to launch
1. Upload the repository to your Hugging Face Space.
2. Ensure your **Azure OpenAI** environment variables are set in the Space's secrets or environment variables:
- `AZURE_OPENAI_API_KEY`
- `AZURE_OPENAI_ENDPOINT`
- `AZURE_OPENAI_EMBEDDING_DEPLOYMENT` (embedding deployment name, used as deployment_name in code)
- `AZURE_OPENAI_CHAT_DEPLOYMENT` (chat deployment name, used as deployment_name in code)
- `AZURE_OPENAI_API_VERSION` (API version for both embeddings and chat)
3. The Space will automatically run `gradio_app.py`.
4. Use the web interface to upload a PDF and enter topic details to generate questions.
---
# Prashnotri – AI Question Paper Generator
## Overview
Prashnotri is an AI-powered platform for generating high-quality, customized question papers. It leverages Azure OpenAI for question generation, supports feedback-driven improvement, and stores generated papers and user uploads securely on AWS S3. The app is built with a Python Flask backend (optional for local/server deployment), a React frontend, and MongoDB for data storage.
## Features
- Generate question papers based on subject, class, topic, difficulty, Bloom's level, and intelligence type/subtype.
- Store and serve generated PDFs from AWS S3.
- User feedback system for continuous improvement.
- Secure, scalable deployment on AWS EC2 with Nginx and HTTPS.
- (Planned) Upload notes as PDF/images for context-aware question generation.
## Tech Stack
- **Backend:** Python (Flask, optional), Azure OpenAI API, MongoDB, AWS S3, Boto3
- **Frontend:** React (TypeScript)
- **Deployment:** Ubuntu EC2, Nginx, PM2 (or systemd), Certbot (Let's Encrypt SSL)
- **Other:** Python virtualenv, dotenv for secrets
## Environment Variables (.env)
Create a `.env` file with your Azure OpenAI credentials (see the `.env` example earlier in this document), plus the AWS S3 and MongoDB settings used by the Flask backend.
## Deployment Steps
### 1. Server Setup
- Launch an Ubuntu EC2 instance.
- Open ports 22 (SSH), 80 (HTTP), and 443 (HTTPS) in the security group.
### 2. Install Dependencies
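The exact packages depend on your setup; a minimal sketch (Node.js is only needed if you run the app under PM2) might be:
```bash
sudo apt update
sudo apt install -y python3-pip python3-venv nginx git

# PM2 runs on Node.js; skip these two lines if you use systemd instead
sudo apt install -y nodejs npm
sudo npm install -g pm2
```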
### 3. Clone and Set Up the Project
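For example (the repository URL and directory name are placeholders):
```bash
git clone <your-repo-url> prashnotri
cd prashnotri

# Create and activate a virtual environment, then install Python dependencies
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```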
### 4. Configure Environment
Create a `.env` file with your secrets (see the Azure OpenAI variables above).
### 5. Frontend Build
Build your React frontend locally, then copy the build output to the `dist/` folder on your server.
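A typical sequence, assuming a standard `npm run build` script; the directory names, key file, and server address are placeholders:
```bash
# On your local machine (adjust "frontend" to your frontend directory)
cd frontend
npm install
npm run build

# Copy the build output to the server
scp -i your-key.pem -r dist/ ubuntu@<server-ip>:/home/ubuntu/prashnotri/dist/
```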
### 6. Nginx Setup
Create `/etc/nginx/sites-available/prashnotri.com`, then enable the site and restart Nginx.
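A minimal sketch of the server block and the enable/restart commands; the domain comes from the path above, but the web root and the Flask port (5000) are assumptions to adjust:
```bash
sudo tee /etc/nginx/sites-available/prashnotri.com > /dev/null <<'EOF'
server {
    listen 80;
    server_name prashnotri.com www.prashnotri.com;

    # Serve the React build
    root /home/ubuntu/prashnotri/dist;
    index index.html;

    location / {
        try_files $uri /index.html;
    }

    # Proxy API requests to the Flask backend (assumed to listen on 127.0.0.1:5000)
    location /api/ {
        proxy_pass http://127.0.0.1:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
EOF

# Enable the site and restart Nginx
sudo ln -s /etc/nginx/sites-available/prashnotri.com /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl restart nginx
```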
### 7. SSL with Certbot
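For example, using the Nginx plugin:
```bash
sudo apt install -y certbot python3-certbot-nginx
sudo certbot --nginx -d prashnotri.com -d www.prashnotri.com
```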
### 8. Run the Flask App with PM2 (only if using the Flask backend)
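One way to do this; the entry file `app.py` is an assumption, and the process name matches the `pm2 logs prashnotri` command used in Troubleshooting below:
```bash
# Start the Flask app under PM2 (use ./venv/bin/python if running inside the virtualenv)
pm2 start app.py --name prashnotri --interpreter python3

# Persist the process list and re-create it on reboot
pm2 save
pm2 startup
```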
## Usage
1. **Access the app:** go to https://prashnotri.com in your browser.
2. **Fill in the form:** subject, class, language, topics, question type, difficulty, Bloom's level, intelligence type, and subtype. Add additional instructions if needed.
3. **Submit:** the app generates a question paper and provides a PDF download link.
4. **Feedback:** users can submit feedback on generated papers for future improvement.
## Intelligence Types & Subtypes
| Intelligence | Refined SubTypes (Examples) |
| -------------- | --------------------------------------------------------------- |
| Logical | Pattern solving, Deductive reasoning, Coding logic, Data interpretation |
| Linguistic | Storytelling, Persuasive argument, Vocabulary building, Creative writing |
| Kinesthetic | Gross motor (e.g., sports), Fine motor (e.g., drawing), Simulations |
| Spatial | 3D visualization, Map reading, Mental rotation, Blueprint understanding |
| Musical | Rhythm patterns, Composition, Tone recognition |
| Interpersonal | Negotiation skills, Group collaboration, Empathy exercises |
| Intrapersonal | Self-assessment, Reflective writing, Goal setting |
| Naturalistic | Classification tasks, Field observations, Environmental problem-solving |
## Prompt Engineering
- Prompts are detailed, with strict requirements for question quality, format, and explanation (see the illustrative sketch below).
- Example and "Do Not" sections are included for clarity.
- Feedback from users is incorporated into future prompt generations.
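For illustration only (this is not the app's actual prompt), a template along these lines captures the structure described above; all placeholder names are hypothetical:
```python
from langchain_core.prompts import ChatPromptTemplate

# Hypothetical sketch of a question-generation prompt with strict requirements
# and an explicit "Do not" section, as described above.
question_prompt = ChatPromptTemplate.from_template(
    "You are an expert question paper setter.\n"
    "Generate {num_questions} {question_type} questions for class {class_level} "
    "{subject} on the topic '{topic}'.\n"
    "Difficulty: {difficulty}. Bloom's level: {blooms_level}. "
    "Intelligence type: {intelligence_type} ({intelligence_subtype}).\n"
    "Requirements: every question must be answerable from the provided context "
    "and must include a brief explanation of the expected answer.\n"
    "Do not: repeat questions, go outside the given topic, or omit explanations.\n\n"
    "Context:\n{context}"
)
```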
## Feedback & Improvement Plan
- Collect user feedback on question quality.
- Iteratively refine prompts and add features (PDF upload, analytics, etc.).
- Monitor logs and performance for stability.
- Prepare for public launch after 1–2 months of testing and improvement.
## Future Features
- PDF/image upload for context-aware question generation.
- Analytics dashboard for usage and feedback.
- Enhanced user management and authentication.
- More advanced feedback integration and prompt tuning.
## Troubleshooting
- **502 Bad Gateway:** ensure the Flask app is running and PM2 is managing it.
- **SSL issues:** check the Certbot and Nginx configuration.
- **App not running after reboot:** make sure `pm2 save` and `pm2 startup` were run.
- **Logs:**
  - Flask: `pm2 logs prashnotri`
  - Nginx: `sudo tail -n 50 /var/log/nginx/error.log`
## Contact & Support
- For issues, open a GitHub issue or contact the maintainer.
- For deployment or feature requests, reach out via email or the project's contact form.
## Configuration
This app requires several Azure OpenAI environment variables to be set for proper operation. You can configure these in two ways:
### 1. Local Development (.env file)
Create a file named `.env` in your project root with the following content (replace values with your actual Azure OpenAI credentials):
```
AZURE_OPENAI_API_KEY=your-azure-openai-key
AZURE_OPENAI_ENDPOINT=https://your-resource-name.openai.azure.com/
AZURE_OPENAI_EMBEDDING_DEPLOYMENT=your-embedding-deployment
AZURE_OPENAI_CHAT_DEPLOYMENT=your-chat-deployment
AZURE_OPENAI_API_VERSION=2023-05-15
```
### 2. Hugging Face Spaces (Repository Secrets)
Go to your Space's **Settings** > **Repository secrets** and add the following secrets:
- `AZURE_OPENAI_API_KEY`
- `AZURE_OPENAI_ENDPOINT`
- `AZURE_OPENAI_EMBEDDING_DEPLOYMENT` (embedding deployment name)
- `AZURE_OPENAI_CHAT_DEPLOYMENT` (chat deployment name)
- `AZURE_OPENAI_API_VERSION`
**Note:** The code passes `deployment_name` as a top-level argument to `AzureOpenAIEmbeddings` and `AzureChatOpenAI`, matching current LangChain and OpenAI SDK requirements. Do not set `deployment_name` inside `model_kwargs`.
**Do not upload your .env file to the repository for security reasons.**