LM Studio
7,437 lines. LM Studio came to play.
LM Studio is your go-to desktop app for running LLMs like Llama and DeepSeek locally. With a user-friendly chat interface and easy model management, it empowers developers to experiment and innovate right from their computers.
Not sure yours is this good? Check it →
LM Studio's llms.txt Insights
Overachiever
149 sections. Most sites can barely manage 3. This one went all in.
War and Peace vibes
7,437 lines. They really wanted AI to understand them.
Double trouble
Serves both llms.txt and llms-full.txt. Someone takes this seriously.
What's inside LM Studio's llms.txt
LM Studio's llms.txt opens with these 11 top-level sections:
- app
- About LM Studio
- How do I install LM Studio?
- System requirements
- Run llama.cpp (GGUF) or MLX models
- Run an LLM like `Llama`, `Phi`, or `DeepSeek R1` on your computer
- Chat with documents entirely offline on your computer
- Use LM Studio's API from your own apps and scripts
- Community
- API Changelog
- Idle TTL and Auto Evict
How does LM Studio's llms.txt compare?
| | LM Studio | Directory Avg | Top Performer |
|---|---|---|---|
| Lines | 7,437 | 1,029 | 163,447 |
| Sections | 149 | 17 | 3,207 |
Cool table. Now the real question: where do you land? Find out →
LM Studio's llms.txt preview
First 100 of 7,437 lines
# app
# About LM Studio
> Learn how to run Llama, DeepSeek, Phi, and other LLMs locally with LM Studio.
LM Studio is a desktop app for developing and experimenting with LLMs locally on your computer.
**Key functionality**
1. A desktop application for running local LLMs
2. A familiar chat interface
3. Search & download functionality (via Hugging Face 🤗)
4. A local server that can listen on OpenAI-like endpoints
5. Systems for managing local models and configurations
<hr>
### How do I install LM Studio?
Head over to the [Downloads page](/download) and download an installer for your operating system.
LM Studio is available for macOS, Windows, and Linux.
<hr>
### System requirements
LM Studio generally supports Apple Silicon Macs, x64/ARM64 Windows PCs, and x64 Linux PCs.
Consult the [System Requirements](app/system-requirements) page for more detailed information.
<hr>
### Run llama.cpp (GGUF) or MLX models
LM Studio supports running LLMs on Mac, Windows, and Linux using [`llama.cpp`](https://github.com/ggerganov/llama.cpp).
On Apple Silicon Macs, LM Studio also supports running LLMs using Apple's [`MLX`](https://github.com/ml-explore/mlx).
To install or manage LM Runtimes, press `⌘` `Shift` `R` on Mac or `Ctrl` `Shift` `R` on Windows/Linux.
<hr>
### Run an LLM like `Llama`, `Phi`, or `DeepSeek R1` on your computer
To run an LLM on your computer you first need to download the model weights.
You can do this right within LM Studio! See [Download an LLM](app/basics/download-model) for guidance.
<hr>
### Chat with documents entirely offline on your computer
You can attach documents to your chat messages and interact with them entirely offline, also known as "RAG".
Read more about how to use this feature in the [Chat with Documents](app/basics/rag) guide.
### Use LM Studio's API from your own apps and scripts
LM Studio provides a REST API that you can use to interact with your local models from your own apps and scripts.
- [OpenAI Compatibility API](api/openai-api)
- [LM Studio REST API (beta)](api/rest-api)
<hr>
### Community
Join the LM Studio community on [Discord](https://discord.gg/aPQfnNkxGC) to ask questions, share knowledge, and get help from other users and the LM Studio team.
## API Changelog
> LM Studio API Changelog - new features and updates
###### [👾 LM Studio 0.3.9](blog/lmstudio-v0.3.9) • 2025-01-30
### Idle TTL and Auto Evict
Set a TTL (in seconds) for models loaded via API requests (docs article: [Idle TTL and Auto-Evict](/docs/api/ttl-and-auto-evict))
```diff
curl http://localhost:1234/api/v0/chat/completions \
-H "Content-Type: application/json" \
-d '{
"model": "deepseek-r1-distill-qwen-7b",
"messages": [ ... ]
+ "ttl": 300,
}'
```
With `lms`:
```
lms load --ttl <seconds>
```
What is llms.txt?
llms.txt is an open standard that helps AI language models understand your website. By placing a structured markdown file at /llms.txt, websites provide AI search engines like ChatGPT, Claude, and Perplexity with a clear map of their content, services, and documentation. Companies like LM Studio use it to ensure AI accurately represents their brand when answering user queries. Read the spec.
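The structure the spec describes (an H1 title, a blockquote summary, then H2 sections holding lists of links) is simple enough to parse with a few lines of Python. The sketch below is purely illustrative, not an official tool; the sample text and the `parse_llms_txt` helper are invented for this example.

```python
import re

# A tiny llms.txt-style sample, invented for illustration.
SAMPLE = """# LM Studio
> Run Llama, DeepSeek, Phi, and other LLMs locally.

## Docs
- [About LM Studio](https://lmstudio.ai/docs): Overview of the app
- [System requirements](https://lmstudio.ai/docs/system-requirements): Supported platforms
"""

def parse_llms_txt(text):
    """Split an llms.txt-style file into title, summary, and H2 link sections."""
    title, summary, current = None, None, None
    sections = {}
    for line in text.splitlines():
        if line.startswith("# ") and title is None:
            title = line[2:].strip()                      # H1: site name
        elif line.startswith("> ") and summary is None:
            summary = line[2:].strip()                    # blockquote: summary
        elif line.startswith("## "):
            current = line[3:].strip()                    # H2: section heading
            sections[current] = []
        elif line.startswith("- ") and current is not None:
            m = re.match(r"- \[(.+?)\]\((.+?)\)", line)   # markdown link list item
            if m:
                sections[current].append({"title": m.group(1), "url": m.group(2)})
    return {"title": title, "summary": summary, "sections": sections}

result = parse_llms_txt(SAMPLE)
```

Running this on the sample yields a title of `LM Studio` and one `Docs` section with two links, which is the shape an AI crawler would see when it maps a site's content.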
LM Studio showed up. Where's yours?
1000+ companies didn't overthink it. 60 seconds. Go.
Check your site →