Use Azure OpenAI in Fabric with Python SDK (preview)

This article shows how to use Azure OpenAI in Fabric with the OpenAI Python SDK. For distributed processing of large datasets, see Use Azure OpenAI with SynapseML. For the simplest approach using Pandas AI Functions, see Use Azure OpenAI with AI Functions.

Prerequisites

The default runtime doesn't include the OpenAI Python SDK, so you need to install it.

%pip install -U openai

Create Fabric-authenticated client

To use Azure OpenAI in Fabric, create a client with Fabric's authentication:

from synapse.ml.fabric.credentials import get_openai_httpx_sync_client
import openai

client = openai.AzureOpenAI(
    http_client=get_openai_httpx_sync_client(),
    api_version="2025-04-01-preview",
)

This client handles authentication automatically when running in Fabric notebooks. Use this client for all subsequent API calls.

Chat completions

The following example shows a simple chat completion request. For complete API reference, see Chat Completions API.

response = client.chat.completions.create(
    model="gpt-4.1",
    messages=[
        {
            "role": "user",
            "content": """Analyze the following text and return a JSON array of issue insights.

Each item must include:
- issue_brief (1 sentence)
- scenario
- severity (high | medium | low)
- verbatim_quotes (list)
- recommended_fix

Text:
We booked the hotel room in advance for our family trip. The check-in the great however the room service was slow and pool was closed

Return JSON only.
"""
        }
    ],
)
print(response.choices[0].message.content)
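Because the prompt asks for JSON only, the reply can be parsed directly with the standard library. A minimal sketch follows; the reply string here is a hypothetical model output shaped like the schema requested in the prompt, shown only so the example is self-contained:

```python
import json

# Hypothetical model output matching the requested schema
# (a real run would use response.choices[0].message.content).
reply = """[
  {
    "issue_brief": "Room service was slow.",
    "scenario": "Family hotel stay booked in advance",
    "severity": "medium",
    "verbatim_quotes": ["the room service was slow"],
    "recommended_fix": "Increase room service staffing during peak hours."
  }
]"""

issues = json.loads(reply)
for issue in issues:
    print(f"[{issue['severity']}] {issue['issue_brief']}")
```

In production, wrap the `json.loads` call in error handling, since the model can occasionally return text that isn't valid JSON.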

Responses API

The Responses API is OpenAI's recommended approach for new implementations. It provides improved response quality and better handling of structured outputs. For complete API reference, see Responses API.

response = client.responses.create(
    model="gpt-4.1",
    input=[
        {
            "role": "user",
            "content": "Explain quantum computing in simple terms."
        }
    ],
    store=False  # Fabric LLM endpoint does not support storage
)
print(response.output_text)

Note

The Fabric LLM endpoint does not support the store parameter set to True or the previous_response_id parameter.

Embeddings

An embedding is a data representation that machine learning models and algorithms can readily use. It captures the semantic meaning of a text as a vector of floating-point numbers, and the distance between two embeddings in the vector space reflects the semantic similarity of the two original inputs. For complete API reference, see Embeddings API.

response = client.embeddings.create(
    input="The food was delicious and the waiter...",
    model="text-embedding-ada-002",
)
print(response.data[0].embedding)
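The relationship between distance and semantic similarity described above is commonly measured with cosine similarity. The following is a minimal sketch in plain Python with no extra dependencies; the toy vectors stand in for real embeddings, which in practice you would obtain from `client.embeddings.create` as shown above:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity of two equal-length vectors: dot(a, b) / (|a| * |b|)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for real embedding vectors
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.31]   # nearly identical to v1
v3 = [-0.3, 0.1, -0.2]  # pointing in a different direction

print(cosine_similarity(v1, v2))  # close to 1.0: semantically similar
print(cosine_similarity(v1, v3))  # much lower: dissimilar
```

For large collections of embeddings, a vectorized implementation (for example with NumPy) is far more efficient, but the formula is the same.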

Available models and rates

For information about available models and consumption rates, see Foundry Tools consumption rate.

Related content

Fabric documentation

OpenAI Python SDK documentation