Creates a new audio track in the Ableton Live session
Creates a new MIDI track in the Ableton Live session
Creates a new return track in the Ableton Live session
Deletes a track by its index
Gets the names of all tracks in the session
Sets the name of a track by its index
Gets the volume of a track by its index
Sets the volume of a track by its index
Gets the pan position of a track by its index
Sets the pan position of a track by its index
Gets the mute state of a track by its index
Sets the mute state of a track by its index
Gets the solo state of a track by its index
Sets the solo state of a track by its index
Gets the arm state of a track by its index
Sets the arm state of a track by its index
Gets the names of all clips in a track
Plays a clip by track and clip index
Stops a clip by track and clip index
Gets the names of all devices on a track
Gets the parameters of a device on a track
Sets a parameter value for a device on a track
Gets the input routing of a track
Sets the input routing of a track
Gets the output routing of a track
Sets the output routing of a track
Gets the current tempo of the session
Sets the tempo of the session
Starts playback of the session
Stops playback of the session
Gets the current state of the Ableton Live session
The Ableton Live Controller enables AI assistants to interact with the Ableton Live digital audio workstation through natural language commands. Using the Open Sound Control (OSC) protocol, it creates a bridge between language models and music production software, allowing for intuitive control of tracks, clips, devices, and mixing parameters. This implementation provides comprehensive access to Ableton Live's functionality through a Model Context Protocol server, making music production more accessible and efficient. Whether you're setting up recording sessions, adjusting parameters, or managing complex projects, this controller translates natural language instructions into precise Ableton Live operations.
Under the hood, the controller uses the Open Sound Control (OSC) protocol to communicate with Ableton Live and implements the Model Context Protocol (MCP) to connect language models to the music production software.
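The controller handles all of this OSC traffic for you, but a minimal sketch helps show what that layer looks like. The example below uses the python-osc package and assumes AbletonOSC is loaded in Ableton Live and listening on its default port 11000; the OSC addresses follow AbletonOSC's naming and should be verified against its documentation:

```python
# Minimal sketch of the OSC layer the controller builds on.
# Assumptions: python-osc is installed, AbletonOSC is set up as a control
# surface in Ableton Live, and it listens on its default port 11000.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 11000)

# Addresses follow AbletonOSC's naming; check them against the AbletonOSC docs.
client.send_message("/live/song/set/tempo", 120.0)        # set the session tempo
client.send_message("/live/track/set/volume", [0, 0.85])  # track 0 volume (0.0-1.0)
client.send_message("/live/song/start_playing", [])       # start playback
```

The MCP server and OSC daemon in this project wrap messages like these behind the tools listed above, so Claude never has to construct OSC addresses directly.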
Before installing the Ableton Live Controller, ensure you have the uv package manager (recommended). If you don't have uv installed, you can install it with:
curl -LsSf https://astral.sh/uv/install.sh | sh
git clone https://github.com/Simon-Kansara/ableton-live-mcp-server.git
cd ableton-live-mcp-server
uv sync
Follow the installation instructions in the AbletonOSC GitHub repository to set up the control surface in Ableton Live.
The OSC daemon handles communication between the MCP server and Ableton Live:
uv run osc_daemon.py
This starts the daemon, which relays requests from the MCP server to Ableton Live over OSC using its default host and port settings. To modify these settings, edit the AbletonOSCDaemon class in osc_daemon.py.
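The attribute names and default values below are assumptions for illustration only, not a copy of the repository's code; check osc_daemon.py for the actual settings:

```python
# Hypothetical sketch only: names and defaults are assumptions, not the
# repository's actual code. Edit the real values in osc_daemon.py.
class AbletonOSCDaemon:
    def __init__(self,
                 socket_host="localhost",   # where the MCP server connects to the daemon
                 socket_port=65432,         # daemon's listening port (assumed default)
                 ableton_host="127.0.0.1",  # host running Ableton Live with AbletonOSC
                 ableton_port=11000):       # AbletonOSC's default receive port
        self.socket_host = socket_host
        self.socket_port = socket_port
        self.ableton_host = ableton_host
        self.ableton_port = ableton_port
```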
To use the Ableton Live Controller with Claude Desktop, add the following configuration to your Claude Desktop settings file:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json
Add the configuration as shown in the installation section below.
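As a rough sketch of that configuration (the server name, script name, and paths here are placeholders; use the exact command and arguments given in the repository's installation instructions), an entry in claude_desktop_config.json generally looks like this:

```json
{
  "mcpServers": {
    "ableton-live": {
      "command": "uv",
      "args": [
        "run",
        "--directory",
        "/path/to/ableton-live-mcp-server",
        "mcp_ableton_server.py"
      ]
    }
  }
}
```

Restart Claude Desktop after editing the file so the new server entry is picked up.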
Once configured, you can ask Claude to perform session tasks in plain language, such as creating a MIDI track, setting the tempo, or adjusting a track's volume. The controller translates these natural language requests into the appropriate OSC messages to control Ableton Live.