
Set the Query Converter AI Provider and Large Language Model


Query Converter generates application code to help migrate an application from its original database to MongoDB. Query Converter uses Generative AI (GenAI). For more information on how MongoDB uses GenAI, see AI and Data Usage Information and the Generative AI FAQ.

You can configure Query Converter to use a customer-hosted Large Language Model (LLM) instead of using the default GPT-4o model on the Azure OpenAI service. When configured to use a locally-hosted LLM service, Relational Migrator doesn't require internet access for query conversion.

Important

Query Converter has been tested and optimized with GPT-4o. If you change the LLM, test Query Converter output to verify that code quality meets your needs. LLM output varies widely between models. Code quality can also vary within the same model depending on programming language, design patterns, and other factors.

Before You Begin

  • Review the list of supported providers and models.

  • Configure your AI provider. This can include configuring access to the service, creating an API key or access credentials for Query Converter, or provisioning other resources. See your AI provider's documentation for details.

Steps

1

Stop the Relational Migrator executable or service.

2

Open the configuration file.
This file is located at:

  • macOS: ~/Library/Application Support/MongoDB/Relational Migrator/user.properties

  • Windows: C:\Users\%USERNAME%\AppData\Local\MongoDB\Relational Migrator\Data\user.properties

  • Linux: ~/Migrator/user.properties

3

Configure Relational Migrator for your LLM.

Add or set the following migrator.queryconversion.llm.options values for your provider. If you're changing your LLM configuration, remove any options that aren't used by the new LLM.
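
The edit can also be scripted. Here is a minimal Python sketch (not part of Relational Migrator; the helper name is illustrative) that rewrites the file's contents, dropping any stale migrator.queryconversion.llm.options.* keys left over from a previous provider before appending the new ones:

```python
def set_llm_options(config_text: str, options: dict[str, str]) -> str:
    """Replace every migrator.queryconversion.llm.options.* entry with the
    given options, removing keys that belonged to the previous provider."""
    prefix = "migrator.queryconversion.llm.options."
    # Keep every line that is not an LLM option.
    kept = [line for line in config_text.splitlines()
            if not line.lstrip().startswith(prefix)]
    # Append the new options in "key: value" form.
    added = [f"{prefix}{key}: {value}" for key, value in options.items()]
    return "\n".join(kept + added) + "\n"
```

Unrelated settings in user.properties pass through untouched; only the LLM option block is replaced.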

With Azure OpenAI, Query Converter supports the GPT-4o and GPT-4 models.

Example

migrator.queryconversion.llm.options.provider: AzureOpenAI
migrator.queryconversion.llm.options.apiKey: <API key>
migrator.queryconversion.llm.options.apiVersion: 2024-10-21
migrator.queryconversion.llm.options.deployment: myDeployment
migrator.queryconversion.llm.options.baseUrl: https://my-test-endpoint.openai.azure.com/
migrator.queryconversion.llm.options.model: gpt-4o

Configuration

  • migrator.queryconversion.llm.options.provider: AzureOpenAI

  • migrator.queryconversion.llm.options.apiKey: Your Azure OpenAI API key.

  • migrator.queryconversion.llm.options.apiVersion: The Azure OpenAI API version, in the format YYYY-MM-DD.

  • migrator.queryconversion.llm.options.deployment: The Azure OpenAI deployment name, found on the Model Deployments page of the Azure portal.

  • migrator.queryconversion.llm.options.baseUrl: The Azure OpenAI endpoint, found in the Keys and Endpoint section of the Azure portal.

  • migrator.queryconversion.llm.options.model: gpt-4 or gpt-4o.

For more information, see the Azure OpenAI documentation.
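
The baseUrl, deployment, and apiVersion values combine into the REST endpoint that chat requests are sent to. The following Python sketch (assuming the standard Azure OpenAI URL layout; the function name is illustrative) shows how the three values fit together, which can help you spot a mistyped value before restarting:

```python
def azure_chat_completions_url(base_url: str, deployment: str,
                               api_version: str) -> str:
    """Build the Azure OpenAI chat-completions URL from the three
    configuration values: baseUrl, deployment, and apiVersion."""
    return (f"{base_url.rstrip('/')}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")
```

With the example values above, this produces https://my-test-endpoint.openai.azure.com/openai/deployments/myDeployment/chat/completions?api-version=2024-10-21.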

With OpenAI, Query Converter supports the GPT-4o and GPT-4 models.

Example

migrator.queryconversion.llm.options.provider: OpenAI
migrator.queryconversion.llm.options.apiKey: <API key>
migrator.queryconversion.llm.options.baseUrl: https://api.openai.com/v1
migrator.queryconversion.llm.options.model: gpt-4

Configuration

  • migrator.queryconversion.llm.options.provider: OpenAI

  • migrator.queryconversion.llm.options.apiKey: Your OpenAI API key, generated in the OpenAI dashboard.

  • migrator.queryconversion.llm.options.baseUrl: https://api.openai.com/v1

  • migrator.queryconversion.llm.options.model: gpt-4 or gpt-4o.

For more information, see the OpenAI documentation.
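
To verify your key and model outside Relational Migrator, you can assemble a request against the OpenAI Chat Completions API ({baseUrl}/chat/completions) yourself. A minimal Python sketch (the helper name is illustrative) of the headers and body that API expects:

```python
def openai_request(api_key: str, model: str, prompt: str) -> tuple[dict, dict]:
    """Assemble headers and JSON body for a POST to
    {baseUrl}/chat/completions on the OpenAI API."""
    headers = {
        "Authorization": f"Bearer {api_key}",  # the apiKey option
        "Content-Type": "application/json",
    }
    body = {
        "model": model,  # the model option, e.g. gpt-4 or gpt-4o
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, body
```

Sending this with any HTTP client should return a completion if the key and model name are valid.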

With Amazon Bedrock, Query Converter supports the Claude 3.5 (Sonnet) and Mistral Large models.

Example

migrator.queryconversion.llm.options.provider: AWSBedrock
migrator.queryconversion.llm.options.awsAccessKeyId: <Access key>
migrator.queryconversion.llm.options.awsSecretAccessKey: <Secret key>
migrator.queryconversion.llm.options.regionName: us-east-1
migrator.queryconversion.llm.options.modelId: anthropic.claude-3-5-sonnet-20241022-v2:0

Configuration

  • migrator.queryconversion.llm.options.provider: AWSBedrock

  • migrator.queryconversion.llm.options.awsAccessKeyId: The ID for an AWS access key associated with an IAM account.

  • migrator.queryconversion.llm.options.awsSecretAccessKey: The AWS secret access key for the account.

  • migrator.queryconversion.llm.options.regionName: The regional endpoint where the model is deployed.

  • migrator.queryconversion.llm.options.modelId: anthropic.claude-3-5-sonnet-20241022-v2:0 or mistral.mistral-large-2407-v1:0.

For more information, see the Amazon Bedrock documentation, or the model-specific documentation from Anthropic or Mistral.
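
Because the Bedrock configuration needs all four credential and model options, a quick completeness check before restarting can save a failed startup. A minimal Python sketch (a hypothetical helper, not part of Relational Migrator):

```python
# The option names from the Bedrock configuration table above
# (without the migrator.queryconversion.llm.options. prefix).
REQUIRED_BEDROCK_OPTIONS = {
    "provider", "awsAccessKeyId", "awsSecretAccessKey",
    "regionName", "modelId",
}

def missing_bedrock_options(options: dict[str, str]) -> set[str]:
    """Return the required Bedrock options that are absent or blank."""
    return {key for key in REQUIRED_BEDROCK_OPTIONS
            if not options.get(key, "").strip()}
```

An empty result means every required option has a value; anything returned needs to be added to user.properties.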

With GCP Vertex AI, Query Converter supports the Gemini 1.5 Pro model.

Example

migrator.queryconversion.llm.options.provider: GCPVertex
migrator.queryconversion.llm.options.model: gemini-1.5-pro-001

Configuration

  • migrator.queryconversion.llm.options.provider: GCPVertex

  • migrator.queryconversion.llm.options.apiKey: Leave unset. Instead, set up your Application Default Credentials through one of GCP's supported mechanisms.

  • migrator.queryconversion.llm.options.model: gemini-1.5-pro-001 or gemini-1.5-pro-002.

For more information, see the Google Vertex AI documentation.
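
Since Vertex AI authenticates through Application Default Credentials rather than an apiKey option, a pre-flight check of the GOOGLE_APPLICATION_CREDENTIALS environment variable (one of GCP's standard ADC mechanisms) can catch a missing key file early. A minimal Python sketch (the helper name is illustrative, and this covers only the environment-variable mechanism, not gcloud-managed credentials):

```python
from pathlib import Path

def adc_env_ready(env: dict[str, str]) -> bool:
    """Check that GOOGLE_APPLICATION_CREDENTIALS is set and points at an
    existing service-account key file."""
    key_file = env.get("GOOGLE_APPLICATION_CREDENTIALS", "")
    return bool(key_file) and Path(key_file).is_file()
```

Call it with `os.environ` before restarting Relational Migrator; a False result means this ADC mechanism is not configured.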

With self-managed local or cloud deployments, Query Converter supports the Llama 3.1 405B model.

Example

migrator.queryconversion.server.address: http://127.0.0.1:6081
migrator.queryconversion.llm.options.model: Llama-3.1-405B

Configuration

  • migrator.queryconversion.server.address: For local deployments, the URL of the Query Converter endpoint, including the protocol and port.

  • migrator.queryconversion.llm.options.model: Llama-3.1-405B
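
Before restarting, you can confirm that the server address is well-formed and split it into a host and port for a connectivity probe. A minimal Python sketch (the function name is illustrative):

```python
from urllib.parse import urlparse

def parse_server_address(address: str) -> tuple[str, int]:
    """Split a migrator.queryconversion.server.address value such as
    http://127.0.0.1:6081 into (host, port), e.g. for a TCP probe."""
    parsed = urlparse(address)
    if not parsed.hostname:
        raise ValueError(f"invalid server address: {address!r}")
    return parsed.hostname, parsed.port or 80
```

The resulting pair can be passed to `socket.create_connection` to verify that the local endpoint is reachable.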

4

Save and close the file, and restart Relational Migrator.

Next Steps

  • AI & Data Usage