Update README.md (#2)
- Update README.md (0bdc7caebb696b6affc08b6960f98a3afa90c0fd)
Co-authored-by: Devin Gulliver <[email protected]>
    	
README.md CHANGED

@@ -32,7 +32,6 @@ Falcon3-Mamba-7B-Base supports a context length up to 32K and was mainly trained
 - 32k context length
 - 65k vocab size
 - Continue Pretrained from [Falcon-Mamba-7b](https://arxiv.org/abs/2410.05355), with another 1500 Gigatokens of data consisting of web, code, STEM and high quality data.
-- Postrained on 1.2 million samples of STEM, conversations, code, and safety.
 - Developed by [Technology Innovation Institute](https://www.tii.ae)
 - License: TII Falcon-LLM License 2.0
 - Model Release Date: December 2024
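For context, the README this commit edits describes the Falcon3-Mamba-7B-Base checkpoint (32k context length, 65k vocab size). The following is a minimal sketch of loading and sampling from it with Hugging Face transformers; it is not part of this commit, and it assumes the checkpoint is published as `tiiuae/Falcon3-Mamba-7B-Base` and that the installed transformers release supports the FalconMamba architecture.

```python
# Minimal sketch (assumption: repo id "tiiuae/Falcon3-Mamba-7B-Base",
# transformers version with FalconMamba support, accelerate installed for device_map).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon3-Mamba-7B-Base"  # assumed Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The model card advertises a 32k context window; this is just a short smoke test.
inputs = tokenizer("The Technology Innovation Institute is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```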

