We’ve removed all latest tags from Instill Core service images. Going forward, every service must reference a specific Git tag or commit hash — making dependencies more stable and traceable.

What’s new

  • Services no longer use latest — everything is now explicitly versioned
  • Better reproducibility across environments

How to run Instill Core CE

  • Stable version (recommended for most users):
git clone -b $VERSION https://github.com/instill-ai/instill-core.git && cd instill-core
make run

Replace $VERSION with the latest release tag.

  • Development version (for contributors and latest CE features):
git clone https://github.com/instill-ai/instill-core.git && cd instill-core
make compose-dev

Please update your local setup or deployment scripts accordingly.

Observability

by Instill AI

Instill Core's tracing and logging with OpenTelemetry and Grafana Stack

Built-in observability is now available in Instill Core CE via Docker Compose — powered by OpenTelemetry and the Grafana Stack. This includes tracing with Tempo and logging with Loki, with zero manual setup.

Simply enable it by running make all OBSERVE_ENABLED=true, then access Grafana at localhost:3001.


💫 Improvements

Artifact

  • Allowed duplicate file names within an artifact catalog.

🐛 Bug Fixes

Artifact

  • Returned pre-signed URLs directly as download links in the Files API.
  • Fixed empty download link issue in the Files API to support the legacy upload method.

Component

  • Document operator: Logged errors and returned a blank body when pdfplumber fails.

We’ve enabled Model Context Protocol (MCP) support for the Instill Core documentation — so your favorite AI tools can now talk to our docs and APIs directly.

What’s new:

  • Instill Core now exposes a public MCP server: https://docs.instill-ai.com/mcp
  • Out-of-the-box support for AI tools like Cursor, Claude Desktop, and Windsurf
  • AI assistants can search our docs, generate code, and even call live API endpoints — powered by our OpenAPI spec
  • No extra setup required — this works automatically with our ReadMe-powered documentation site

What can you do?

Try asking your AI Assistant:

"What is Instill Core?"

"Show me an example of using the OpenAI component in a pipeline."

"How do I run a pipeline on Instill Core?"

Whether you're exploring Instill Core for the first time or building complex workflows, you can now supercharge your development with AI-assisted documentation and API access.

👉 Check the full guide here: https://docs.instill-ai.com/docs/mcp

Start building faster — with your AI tools fully connected to Instill Core.

We’ve launched an all-new documentation site for Instill Core — easier to navigate and designed to help you find what you need, fast.

What’s new:

  • New URL: Visit docs.instill-ai.com
  • Cleaner API examples with copy-ready snippets throughout the site
  • Docs, API reference, and changelog — all in one place

The previous site will be retired soon — please update your bookmarks.

Whether you’re just getting started or deep into production, the new site makes working with Instill Core smoother than ever.

Instill Core now supports multiple LLM runtimes, including MLC LLM, vLLM, and Transformers — giving you more flexibility to run fast, efficient, and scalable LLMs on your preferred runtime engine.

What’s new:

You can now specify the desired LLM runtime in your model’s instill.yaml:

build:
  # (Required) Set to true if your model requires GPU.
  gpu: true

  # (Optional) LLM runtime to use.
  # Supported: mlc-llm, vllm, transformers
  llm_runtime: vllm

Once defined, and paired with the corresponding model implementation, run instill build to containerize your model, then instill push to deploy it to Instill Core.

For full details on model building, see the documentation.

Whether you’re optimizing for performance, cost, or deployment footprint, Instill Core now empowers you to serve LLMs your way.

We’ve added a new task to the Document Operator — you can now split a document into individual pages, with support for custom batch sizes, enabling more controlled workflows.

Use cases:

  • Pre-process long PDFs before parsing
  • Apply AI operations in page batches
  • Improve downstream precision

📄 Example pipeline recipe:

version: v1beta
variable:
  document:
    type: document
    description: Input document
component:
  document-op:
    type: document
    task: TASK_SPLIT_IN_PAGES
    input:
      document: ${variable.document}
      batch-size: 8
output:
  page-batches:
    title: Document page batches
    value: ${document-op.output.batches}

This splits the input document into batches of 8 pages.
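As a concrete illustration of the batching arithmetic (a sketch only, not the Document Operator's actual implementation), a 20-page document with batch-size: 8 produces batches of 8, 8, and 4 pages:

```python
# Illustrative sketch of how a batch size partitions document pages.
# This mirrors the arithmetic only; it is not the operator's code.
def split_in_pages(num_pages: int, batch_size: int) -> list[list[int]]:
    """Group 1-based page numbers into consecutive batches of at most batch_size."""
    pages = list(range(1, num_pages + 1))
    return [pages[i:i + batch_size] for i in range(0, len(pages), batch_size)]

batches = split_in_pages(20, 8)
print([len(b) for b in batches])  # → [8, 8, 4]
```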

Instill Core CE

This release brings UI refinements to the Console, enhancements to development tooling, and updated documentation to make it easier for new contributors to get started.

💫 Improvements

Development Tooling

  • Added missing pipeline and model configuration files to docker-compose.

Documentation

  • Updated contribution guidelines to help new collaborators onboard more smoothly.

🐛 Bug Fixes

Console

  • Fixed visual glitch on the Settings > Integration page.
  • Restored missing Perplexity icon on the Settings > Integration page.
  • Removed the Chat tab from the UI.
  • Removed duplicate top navigation bar when creating a model.

Introducing the new Run-on-Event feature on 💧 Instill VDP: pipelines can now listen for events from other services and respond with corresponding actions, enhancing pipeline automation.

What's New

  • Listen for star created events from GitHub
  • Listen for new message events from Slack

Run-on-Event enables you to:

  • Trigger downstream processes by listening to events
  • Automate notifications & alerts
  • Enable real-time data synchronization across different services

Send a message to Slack on a GitHub star event

When someone stars your GitHub repo, automatically send a message to your designated Slack channel.

Customize your event messages

See the example recipe below:

# VDP Version
version: v1beta

on:
  github:
    type: github
    event: EVENT_STAR_CREATED
    config: 
      repository: <USERNAME/YOUR-GITHUB-REPO> # e.g. instill-ai/instill-core
    setup: ${connection.<YOUR-GITHUB-CONNECTION>}
  
variable:
  repository:
    title: repository
    format: string
    listen:
      - ${on.github.message.repository.full-name}
  user:
    title: user
    format: string
    listen:
      - ${on.github.message.sender.login}
  description:
    title: description
    format: string
    listen:
      - ${on.github.message.repository.description}
  stargazers-count:
    title: stargazers-count
    format: number
    listen:
      - ${on.github.message.repository.stargazers-count}
    
output:
  text:
    value: ${variable.repository} ${variable.user}

component:
  slack:
    type: slack
    input:
      channel-name: <YOUR-SLACK-CHANNEL-NAME> # e.g. notify-github-star
      message: |
        ${variable.user} starred ${variable.repository}
        Repo Description: ${variable.description}
        Stargazers Count: ${variable.stargazers-count}
      as-user: false
    setup: ${connection.<YOUR-SLACK-CONNECTION>}
    task: TASK_WRITE_MESSAGE  

This GitHub star and Slack message example demonstrates how Run-on-Event can bridge different services, allowing VDP to create responsive, event-driven workflows that enhance overall system automation and integration.
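To make the recipe's templating concrete, here is an illustrative Python sketch of how the ${...} references resolve into the final Slack message. The payload values below are made up for illustration, and VDP performs this substitution internally; this is not VDP's implementation.

```python
# Hypothetical event payload, shaped like the GitHub star event fields
# referenced in the recipe above (all values are invented for illustration).
event = {
    "repository": {
        "full-name": "instill-ai/instill-core",
        "description": "Example repo description",
        "stargazers-count": 1234,
    },
    "sender": {"login": "octocat"},
}

# Resolve the recipe's variables from the event payload, mirroring the
# `listen` bindings such as ${on.github.message.sender.login}.
variables = {
    "user": event["sender"]["login"],
    "repository": event["repository"]["full-name"],
    "description": event["repository"]["description"],
    "stargazers-count": event["repository"]["stargazers-count"],
}

# Render the Slack message the way the recipe's template would.
message = (
    f"{variables['user']} starred {variables['repository']}\n"
    f"Repo Description: {variables['description']}\n"
    f"Stargazers Count: {variables['stargazers-count']}\n"
)
print(message)
```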

Learn more about this feature by joining our Discord, where we showcase these new features.

Bug Fixes

  • Fixed a bug in the cloning feature

To report a bug or suggest an improvement, please create an issue on GitHub.

OAuth is an open-standard authorization protocol or framework that provides applications the ability for “secure designated access.”

Today, we upgraded the GitHub, Slack and Google Drive Components to use OAuth for a quicker and simpler authentication process to connect to your external apps.

To get started:

  1. Sign in to the Console
  2. Navigate to your top-right Profile > Settings > Integrations
  3. Search for GitHub, Slack, or Google Drive and click Connect

Key Benefits:

  • Secure Connections: Access your resources safely without sharing passwords.
  • Easy Access Management: Quickly control who can access connected services.
  • Automatic Setup: No manual IDs needed—connections are set up automatically.
  • Seamless Integration: Connect to top platforms in just a click, simplifying your workflow.

We will be adding more integrations very soon, so stay tuned!

Learn more about this feature by joining our Discord, where we showcase these new features.

The Dashboard is a centralized view for observing all your pipeline and model run activity and Instill Credit consumption costs. This view includes a chart for data visualization and a table for your run history logs.

This means you can easily review and calculate any future costs associated with your pipeline and model usage. Simply log into your Instill Cloud account and go to the Dashboard tab to check it out!

Learn more about this feature by joining our Discord, where we showcase these new features.

Improvements

Pipeline

  • Auto-save is now triggered automatically when clicking the "Run" button.
  • Added a scaling hint message for the Instill Model component in VDP.
  • Updated post-onboarding landing page to redirect users to the "Explore" page.

Console

  • Standardized logo sizes on the "About" page for visual consistency.

Python SDK

  • Removed _service attribute naming for cleaner code.
  • Added support for specifying target namespaces in endpoint functions.
  • Aligned initialization functions across all service clients for consistency.
  • Removed the outdated configuration system.

Bug Fixes

Pipeline

  • Fixed issue where the iterator component appeared disconnected in the preview canvas.

Model

  • Fixed an error when running a model that was stuck in the "Scaling" stage after 15 minutes.