PandasAI supports several large language models (LLMs) that are used to generate code from natural language queries.
You can set the `OPENAI_API_KEY` environment variable and instantiate the `OpenAI` object without passing the API key:
If you are behind an explicit proxy, you can specify `openai_proxy` when instantiating the `OpenAI` object, or set the `OPENAI_PROXY` environment variable to pass through.
In order to use Google Vertex AI models, you need to install the `google-cloud-aiplatform` package and authenticate with the `gcloud` CLI.
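The setup steps can be sketched as follows (the package name comes from above; the auth command is the standard `gcloud` application-default credentials flow):

```shell
# Install the Vertex AI client library.
pip install google-cloud-aiplatform

# Authenticate the gcloud CLI so PandasAI can reach Vertex AI.
gcloud auth application-default login
```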
You can set the `AZURE_OPENAI_API_KEY`, `OPENAI_API_VERSION`, and `AZURE_OPENAI_ENDPOINT` environment variables and instantiate the Azure OpenAI object without passing them:
If you are behind an explicit proxy, you can specify `openai_proxy` when instantiating the `AzureOpenAI` object, or set the `OPENAI_PROXY` environment variable to pass through.
`inference_server_url` is the only required parameter to instantiate a `HuggingFaceTextGen` model:
In order to use LangChain models, you need to install the `langchain` package:
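For example, with pip:

```shell
pip install langchain
```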
Once you have installed the `langchain` package, you can use it to instantiate a LangChain object:
In order to use Amazon Bedrock models, you need to install the `bedrock` package extra.
The project ID can be found in the project settings (Project -> Manage -> General -> Details). The service URL depends on the region of the provisioned service instance and is listed in the IBM watsonx.ai documentation.
In order to use watsonx.ai models, you need to install the `ibm-watsonx-ai` package.
At this time, watsonx.ai does not support the PandasAI agent.