Instructions for using OpenNLG/OpenBA-V1-Code with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - Transformers
How to use OpenNLG/OpenBA-V1-Code with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="OpenNLG/OpenBA-V1-Code", trust_remote_code=True)
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("OpenNLG/OpenBA-V1-Code", trust_remote_code=True, dtype="auto")
```

- Notebooks
  - Google Colab
  - Kaggle
The model repository's `__init__.py` exports the custom classes that `trust_remote_code=True` loads:

```python
from .modeling_openba import OpenBAForConditionalGeneration
from .configuration_openba import OpenBAConfig
from .tokenization_openba import OpenBATokenizer

__all__ = [
    "OpenBAForConditionalGeneration",
    "OpenBAConfig",
    "OpenBATokenizer",
]
```
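Since the repository exports `OpenBAForConditionalGeneration`, the checkpoint can in principle be used for text generation rather than only feature extraction. A minimal sketch, assuming the custom class follows the standard Hugging Face `generate()` API and resolves through `AutoModelForSeq2SeqLM` with `trust_remote_code` (the prompt text is an illustrative assumption, not taken from the model card; running this downloads the full checkpoint):

```python
# Hedged sketch: AutoModelForSeq2SeqLM resolution and the prompt format below
# are assumptions about this custom model, not documented behavior.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("OpenNLG/OpenBA-V1-Code", trust_remote_code=True)
model = AutoModelForSeq2SeqLM.from_pretrained(
    "OpenNLG/OpenBA-V1-Code", trust_remote_code=True, dtype="auto"
)

# Encode a prompt and generate with the encoder-decoder model
inputs = tokenizer("Write a Python function that reverses a string.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If `AutoModelForSeq2SeqLM` does not map to the custom class, the exported `OpenBAForConditionalGeneration` can be imported from the downloaded repository code instead; check the model card for the expected prompt format.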