Introduction

Nido AI currently provides two types of access to Large Language Models:

  • An API providing pay-as-you-go access to our latest models,

  • Open-source models released under the Apache 2.0 License, downloadable from Hugging Face or directly from the documentation.

Where to start?

API Access

Our API is currently in beta while we ramp up capacity and ensure a good quality of service. Access the platform to join the waitlist. Once your subscription is active, you can immediately use our chat endpoint:

curl --location "https://api.nido.sg/v1/chat/completions" \
     --header 'Content-Type: application/json' \
     --header 'Accept: application/json' \
     --header "Authorization: Bearer $NIDO_API_KEY" \
     --data '{
    "model": "nido-tiny",
    "messages": [{"role": "user", "content": "Who is the most renowned French painter?"}]
  }'
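The same request can be made from Python. The sketch below mirrors the curl call above using only the standard library; the `build_chat_request` and `chat` helper names are ours, and the OpenAI-style `choices[0].message.content` response shape is an assumption, not something these docs confirm.

```python
import json
import os
import urllib.request

API_URL = "https://api.nido.sg/v1/chat/completions"

def build_chat_request(model, user_content):
    """Build the JSON body expected by the chat completions endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_content}],
    }

def chat(model, user_content):
    """POST a chat request. Assumes an OpenAI-style response with
    'choices' -> 'message' -> 'content' (an assumption on our part)."""
    body = json.dumps(build_chat_request(model, user_content)).encode()
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
            "Authorization": f"Bearer {os.environ['NIDO_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# Build the same payload as the curl example (no network call here).
payload = build_chat_request(
    "nido-tiny", "Who is the most renowned French painter?"
)
```

Set `NIDO_API_KEY` in your environment before calling `chat`, just as with the curl example.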

Or our embeddings endpoint:

curl --location "https://api.nido.sg/v1/embeddings" \
     --header 'Content-Type: application/json' \
     --header 'Accept: application/json' \
     --header "Authorization: Bearer $NIDO_API_KEY" \
     --data '{
    "model": "nido-embed",
    "input": ["Embed this sentence.", "As well as this one."]
  }'
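A common next step with embeddings is comparing the returned vectors. The sketch below shows one way to do that; the `extract_embeddings` helper assumes an OpenAI-style `{"data": [{"embedding": [...]}, ...]}` response shape, which is our assumption rather than something these docs specify.

```python
import math

def extract_embeddings(response):
    """Pull the vectors out of an embeddings response.
    Assumes an OpenAI-style {'data': [{'embedding': [...]}, ...]} body
    (an assumption, not confirmed by these docs)."""
    return [item["embedding"] for item in response["data"]]

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors:
    dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)
```

For example, the similarity of a vector with itself is 1.0, and of two orthogonal vectors is 0.0, so ranking documents by `cosine_similarity(query_vec, doc_vec)` surfaces the most semantically similar ones first.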

For a full description of the models offered on the API, head over to the model docs.

For more examples of how to use our platform, head over to our platform docs.

Raw model weights

Raw model weights can be used in several ways:

  • For self-deployment, on cloud or on-premises, using either TensorRT-LLM or vLLM, head over to Deployment,

  • For research, head over to our reference implementation repository,

  • For local deployment on consumer-grade hardware, check out the llama.cpp project or Ollama.

Get Help

Reach out to our business team if you have enterprise needs, want more information about our products, or would like us to add missing features.

Contributing

Nido AI is committed to open-source software development and welcomes external contributions. Please open a PR!
