---
title: MedCodeMCP
emoji: 💬
colorFrom: yellow
colorTo: purple
sdk: gradio
sdk_version: 5.33.0
app_file: app.py
pinned: false
license: apache-2.0
short_description: An MCP tool for symptom-to-ICD diagnosis mapping.
tags:
  - mcp-server-track
  - MistralTeam
---
A voice-enabled medical assistant that takes patient audio complaints, engages in follow-up questions, and returns structured ICD-10 diagnosis suggestions via an MCP endpoint.
# Features

- **Audio input & ASR**: Uses Whisper to transcribe real-time patient audio (e.g. "I've had a dry cough for three days").
- **Interactive Q&A agent**: The LLM asks targeted clarifying questions ("Is your cough dry or productive?") until it is ready to diagnose.
- **Multi-backend LLM**: Switch dynamically between OpenAI GPT, Mistral (via Hugging Face), or any local transformers model using environment flags.
- **ICD-10 mapping**: Uses LlamaIndex to vector-retrieve the most probable ICD-10 codes with confidence scores.
- **MCP-server ready**: Exposes a `/mcp` REST endpoint for seamless integration with agent frameworks.
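The ICD-10 mapping step works by comparing the patient's complaint against indexed code descriptions. The app itself uses LlamaIndex vector retrieval; the stdlib-only sketch below is an illustrative stand-in using bag-of-words cosine similarity over a tiny hypothetical code subset (the `suggest_codes` function and its scores are not the app's actual API):

```python
import math
from collections import Counter

# Toy ICD-10 subset; the real app indexes the full icd10cm_tabular_2025 dataset.
ICD10 = {
    "R05.1": "acute cough",
    "R05.3": "chronic cough",
    "J00": "acute nasopharyngitis common cold",
}

def _vec(text):
    """Bag-of-words term counts for a text."""
    return Counter(text.lower().split())

def _cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def suggest_codes(complaint, top_k=2):
    """Return (code, confidence) pairs ranked by similarity to the complaint."""
    q = _vec(complaint)
    scored = [(code, _cosine(q, _vec(desc))) for code, desc in ICD10.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]
```

A real embedding index replaces the word-count vectors with dense embeddings, but the retrieve-and-rank shape is the same.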
# Getting Started

## Clone & Install

```bash
git clone https://huggingface.co/spaces/gpaasch/Grahams_Gradio_Agents_MCP_Hackathon_2025_Submission.git
cd Grahams_Gradio_Agents_MCP_Hackathon_2025_Submission
python3 -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
```
## Environment Variables

| Name | Description | Default |
|------|-------------|---------|
| `OPENAI_API_KEY` | OpenAI API key for GPT calls | none (required) |
| `HUGGINGFACEHUB_API_TOKEN` | Hugging Face token for Mistral/inference models | none (required for Mistral) |
| `USE_LOCAL_GPU` | Set to `1` to use a local transformers model (no credits) | `0` |
| `LOCAL_MODEL` | Path or HF ID of a local model (e.g. `distilgpt2`) | `gpt2` |
| `USE_MISTRAL` | Set to `1` to use Mistral via HF instead of OpenAI | `0` |
| `MISTRAL_MODEL` | HF ID for the Mistral model (`mistral-small`/`medium`/`large`) | `mistral-large` |
| `MISTRAL_TEMPERATURE` | Sampling temperature for Mistral | `0.7` |
| `MISTRAL_MAX_INPUT` | Maximum tokens in the input prompt | `4096` |
| `MISTRAL_NUM_OUTPUT` | Maximum tokens to generate | `512` |
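The three backend flags combine as a simple precedence check. The sketch below shows one plausible resolution order (local GPU, then Mistral, then OpenAI); the `choose_backend` helper and its precedence are assumptions for illustration, so check `src/app.py` for the app's actual logic:

```python
import os

def choose_backend(env=None):
    """Resolve the LLM backend from the env flags in the table above.

    Precedence (local GPU > Mistral > OpenAI) is an assumption,
    not necessarily the app's actual resolution order.
    """
    if env is None:
        env = os.environ
    if env.get("USE_LOCAL_GPU") == "1":
        return ("local", env.get("LOCAL_MODEL", "gpt2"))
    if env.get("USE_MISTRAL") == "1":
        return ("mistral", env.get("MISTRAL_MODEL", "mistral-large"))
    return ("openai", "gpt")
```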
## Launch Locally

```bash
# Option A: default (OpenAI)
python app.py

# Option B: Mistral backend
export USE_MISTRAL=1
export HUGGINGFACEHUB_API_TOKEN="hf_..."
python app.py

# Option C: local GPU (no credits)
export USE_LOCAL_GPU=1
export LOCAL_MODEL="./distilgpt2"
python app.py
```
Open http://localhost:7860 to:

1. Record your symptoms via the **Microphone** widget.
2. Engage in follow-up Q&A until the agent returns a JSON diagnosis.
## MCP API Usage

Send a POST request to `/mcp` to call the `transcribe_and_respond` tool programmatically:

```bash
curl -X POST http://localhost:7860/mcp \
  -H "Content-Type: application/json" \
  -d '{"tool":"transcribe_and_respond","input":{"audio": "<base64_audio>", "history": []}}'
```

The response is a JSON chat history ending with the final ICD-10 suggestions.
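The same call can be made from Python with the standard library. The sketch below mirrors the payload shape shown in the curl example; the `build_mcp_payload` and `call_mcp` helpers are illustrative, not part of this repository:

```python
import base64
import json
import urllib.request

def build_mcp_payload(audio_bytes, history=None):
    """Build the JSON body matching the curl example above."""
    return {
        "tool": "transcribe_and_respond",
        "input": {
            "audio": base64.b64encode(audio_bytes).decode("ascii"),
            "history": history or [],
        },
    }

def call_mcp(audio_bytes, url="http://localhost:7860/mcp"):
    """POST the payload and return the decoded JSON chat history."""
    body = json.dumps(build_mcp_payload(audio_bytes)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Pass the raw bytes of a recorded audio file (e.g. `open("complaint.wav", "rb").read()`) and feed the returned history back in on subsequent turns to continue the Q&A.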
# Project Structure

```
├── app.py                     # Root wrapper (HF entrypoint)
├── src/
│   └── app.py                 # Core Gradio & agent logic
├── utils/
│   └── llama_index_utils.py   # LLM predictor & indexing utils
├── data/
│   └── icd10cm_tabular_2025/  # ICD-10 dataset
├── requirements.txt           # Dependencies
└── README.md                  # This file
```
# Contributing & Support

- Open an issue or discussion on the [Hugging Face Space](https://huggingface.co/spaces/gpaasch/Grahams_Gradio_Agents_MCP_Hackathon_2025_Submission/discussions).
- Tag `@MistralTeam` to qualify for the $2,000 Mistral prize.
- Post in the **#hackathon** channel on Discord for live help.

---