
MuleSoft Inference Connector Release Notes

Anypoint Connector for MuleSoft Inference (MuleSoft Inference Connector) provides access to inference offerings for large language models (LLMs) from multiple providers, including OpenAI, OpenAI Compatible Endpoints, OpenRouter, Heroku AI, Azure AI Foundry, Azure OpenAI, and many others. The connector provides operations that interface directly with the APIs of these inference providers, enabling seamless integration of AI capabilities into your Mule applications.

1.1.0

September 8, 2025

What’s New

  • The connector now supports the Custom headers field for all operations. Custom headers always take precedence over any existing headers passed as part of an implementation, overwriting values where the same key is used.

  • The Nvidia connection type now includes the custom [Nvidia] Base URL field for sending inference requests, with https://integrate.api.nvidia.com as the default. This supports custom and self-hosted URLs.

  • The Messages field in the [Chat] Completions operation now supports DataSense, which provides schema support.

  • The Tools field in the [Tools] Native Template (Reasoning only) operation now supports DataSense, which provides schema support.

  • The [Agent] Define Prompt Template, [Chat] Answer Prompt, [Chat] Completions, and [Tools] Native Template (Reasoning only) operations now provide default support for toxicity detection by returning the new contentFilterResults and promptFilterResults additional attributes in the response.

  • The connector now supports the Gemini connection type (beta) in Text Generation Config and Vision Config.

  • In the Azure OpenAI connection type:

    • The new [Azure OpenAI] API Version and [Azure OpenAI] Endpoint fields are added to the General tab.

    • In the General tab, a new section called Required: Enter URL or Parameters contains the [Azure OpenAI] Endpoint, [Azure OpenAI] Resource Name, and [Azure OpenAI] Deployment ID fields. In this section, you can either provide the default or a custom endpoint URL (used directly, overriding any parameter values), or provide the resource name and deployment ID (used to construct the base URL).

    • The [Azure OpenAI] User field is moved from the General tab to the Advanced tab.
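The endpoint-versus-parameters behavior described above can be sketched as follows. This is a minimal illustration, not the connector's internal logic; the function name and the Azure OpenAI URL pattern are assumptions to be confirmed against the connector reference:

```python
def resolve_azure_base_url(endpoint=None, resource_name=None, deployment_id=None):
    """Sketch of the precedence described above: an explicit endpoint URL
    is used directly and overrides any parameter values; otherwise the
    resource name and deployment ID are combined into a base URL.
    The URL pattern below is an assumption."""
    if endpoint:
        return endpoint  # endpoint URL wins over resource name / deployment ID
    return f"https://{resource_name}.openai.azure.com/openai/deployments/{deployment_id}"
```

For example, supplying only a resource name and deployment ID yields a constructed base URL, while supplying an endpoint returns it unchanged.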

For more information, see MuleSoft Inference Connector Reference.
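The custom-header precedence introduced in this release can be illustrated with a minimal merge sketch; the function and variable names are illustrative, not connector internals:

```python
def merge_headers(implementation_headers, custom_headers):
    # Start from the headers the implementation already passes, then let
    # custom headers overwrite any value that shares the same key.
    merged = dict(implementation_headers)
    merged.update(custom_headers)
    return merged
```

With this rule, a custom Authorization header replaces one set elsewhere in the flow, while headers with distinct keys are preserved.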

Compatibility

Software | Version
---------|----------------
Mule     | 4.9.4 and later
OpenJDK  | 17

Fixed Issues

Issue Resolution | ID
-----------------|------------
The [Tools] Native Template (Reasoning only) operation now supports a full JSON schema for input parameters, including nested structures and keywords such as items, minItems, and uniqueItems. | W-19271915
The [Chat] Completions operation using the Vertex AI Express connection type now supports the OpenAI standard input template for chat messages. | W-19483173
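As an illustration of the fuller JSON schema support noted for W-19271915, a tool-input schema can include nested structures and array keywords such as items, minItems, and uniqueItems. The schema below is a hypothetical example, not taken from the connector:

```python
import json

# Hypothetical tool-input schema using the keywords named above.
tool_input_schema = {
    "type": "object",
    "properties": {
        "tags": {
            "type": "array",
            "items": {"type": "string"},  # nested item schema
            "minItems": 1,
            "uniqueItems": True,
        },
        "filter": {                       # nested object structure
            "type": "object",
            "properties": {"region": {"type": "string"}},
        },
    },
    "required": ["tags"],
}

print(json.dumps(tool_input_schema, indent=2))
```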

1.0.0

August 1, 2025

What’s New

This is a new connector.

For more information, see the MuleSoft Inference Connector User Guide.

Compatibility

Software | Version
---------|----------------
Mule     | 4.9.4 and later
OpenJDK  | 17