# SlideDeck AI

With SlideDeck AI, co-create slide decks on any topic with Generative Artificial Intelligence.
Describe your topic and let SlideDeck AI generate a PowerPoint slide deck for you—it's as simple as that!

## Star History

[](https://star-history.com/#barun-saha/slide-deck-ai&Date)

## Process

SlideDeck AI works in the following way:

In addition, SlideDeck AI can also create a presentation based on PDF files.

## Summary of the LLMs

SlideDeck AI allows the use of different LLMs from several online providers—Azure OpenAI, Google, Cohere, Together AI, and OpenRouter. Most of these service providers offer generous free usage of relevant LLMs without requiring any billing information.

Based on several experiments, SlideDeck AI generally recommends the use of **Mistral NeMo**, **Gemini Flash**, and **GPT-4o** to generate the slide decks.

The supported LLMs offer different styles of content generation. Use one of the following LLMs along with relevant API keys/access tokens, as appropriate, to create the content of the slide deck:

| Llama 3.3 70B Instruct Turbo | Together AI (`to`) | Mandatory; [get here](https://api.together.ai/settings/api-keys) | Slower, detailed |
| Llama 3.1 8B Instruct Turbo 128K | Together AI (`to`) | Mandatory; [get here](https://api.together.ai/settings/api-keys) | Faster, shorter |

> **IMPORTANT**: SlideDeck AI does **NOT** store your API keys/tokens or transmit them elsewhere. If you provide your API key, it is only used to invoke the relevant LLM to generate the contents. That's it! This is an open-source project, so feel free to audit the code and convince yourself.

In addition, offline LLMs provided by Ollama can be used. Read below to know more.

## Icons

SlideDeck AI uses a subset of icons from [bootstrap-icons-1.11.3](https://github.com/twbs/icons) (MIT license) in the slides. A few icons from [SVG Repo](https://www.svgrepo.com/) (CC0, MIT, and Apache licenses) are also used.

## Local Development

SlideDeck AI uses LLMs via different providers. To run this project by yourself, you need to use an appropriate API key, for example, in a `.env` file.
Alternatively, you can provide the access token in the app's user interface (UI) itself.
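
For illustration, a `.env` file might look like the sketch below. The variable names are placeholders rather than the exact names this project reads; check the project's configuration code for the actual names, and set a key only for the provider(s) you plan to use.

```bash
# Hypothetical .env sketch: variable names are illustrative and not confirmed by this README.
GOOGLE_API_KEY=your-google-api-key
TOGETHER_API_KEY=your-together-ai-api-key
OPENROUTER_API_KEY=your-openrouter-api-key
```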

### Offline LLMs Using Ollama

SlideDeck AI allows the use of offline LLMs to generate the contents of the slide decks. This is typically suitable for individuals or organizations who would like to use self-hosted LLMs, for example, because of privacy concerns.
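
As a quick illustration (not the project's prescribed steps, which follow below), a model can first be made available to Ollama locally and then used in the offline mode; the model name here is only an example:

```bash
# Illustrative sketch: pull a model with Ollama before enabling the offline mode.
ollama pull mistral-nemo   # example model; any model supported by your local Ollama install can be used
ollama list                # the pulled model should now appear among the locally available LLMs
```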

ollama list   # View locally available LLMs
export RUN_IN_OFFLINE_MODE=True   # Enable the offline mode to use Ollama
git clone https://github.com/barun-saha/slide-deck-ai.git
cd slide-deck-ai
git lfs pull   # Pull the PPTX template files - ESSENTIAL STEP!

python -m venv venv   # Create a virtual environment
source venv/bin/activate   # On a Linux system

# Contributors

SlideDeck AI is glad to have the following community contributions:

- [Aditya](https://github.com/AdiBak): added support for page range selection for PDF files and a new chat button.
- [Sagar Bharatbhai Bharadia](https://github.com/sagarbharadia17): added support for the Gemini 2.5 Flash Lite and Gemini 2.5 Flash LLMs.
- [Sairam Pillai](https://github.com/sairampillai): unified the project's LLM access by migrating the API calls to **LiteLLM**.
- [Srinivasan Ragothaman](https://github.com/rsrini7): added OpenRouter support and API keys mapping from the `.env` file.

Thank you all for your contributions!

[](#contributors)