# Welcome to Jac

**The Only Language You Need to Build Anything**
Jac is a programming language designed for humans and AI to build together. It supersets Python and JavaScript with native compilation support, adding constructs that let you weave AI into your code, model complex domains as graphs, and deploy to the cloud -- all without switching languages, managing databases, or writing infrastructure. Jac imagines what should be abstracted away from the developer and automates it through the compiler and runtime.
```jac
# A complete full-stack AI app in one file
node Todo {
    has title: str, done: bool = False;
}

enum Category { WORK, PERSONAL, SHOPPING, HEALTH, OTHER }

def categorize(title: str) -> Category by llm();

def:pub get_todos -> list {
    if not [root-->](?:Todo) {
        root ++> Todo(title="Buy groceries");
        root ++> Todo(title="Finish report");
    }
    return [{"title": t.title, "category": str(categorize(t.title)).split(".")[-1]}
            for t in [root-->](?:Todo)];
}

cl def:pub app() -> JsxElement {
    has items: list = [];
    async can with entry { items = await get_todos(); }
    return <div>{[<p key={i.title}>{i.title} ({i.category})</p>
                  for i in items]}</div>;
}
```
This single file defines a persistent data model, an AI-powered categorizer, a REST API, and a React frontend. No database setup. No prompt engineering. No separate frontend project. Just Jac.
**You can actually run this example.** Save the code above as `main.jac`, then create a `jac.toml` in the same directory:
```toml
[project]
name = "my-app"

[dependencies.npm]
jac-client-node = "1.0.4"

[dependencies.npm.dev]
"@jac-client/dev-deps" = "1.0.0"

[serve]
base_route_app = "app"

[plugins.client]

[plugins.byllm.model]
default_model = "claude-sonnet-4-20250514"
```
Install Jac, set your API key, and run:
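The command block itself did not survive here. A plausible sequence, assuming a pip-based install and the default Anthropic model from the `jac.toml` above — the package name, environment variable, and flags are assumptions, not copied from this page:

```shell
# Assumed commands -- package and env-var names are inferred, not authoritative
pip install jaclang                      # install the Jac toolchain
export ANTHROPIC_API_KEY="sk-ant-..."    # API key for the default claude model
jac start main.jac                       # serve the app on port 8000
```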
Open http://localhost:8000 to see it running. Jac supports any LiteLLM-compatible model -- use gemini/gemini-2.5-flash for a free alternative or ollama/llama3.2:1b for local models.
## The Vision
Programming today demands too much from developers that isn't their problem to solve. You want to build a product, but first you have to pick a backend language, a frontend framework, a database, an ORM, a deployment target, and then glue them all together. If you want AI, add prompt engineering to the list. If you want scale, add DevOps.
Jac takes a different approach: move complexity out of the developer's code and into the language runtime. The things that can be automated -- database schemas, API serialization, client-server communication, prompt construction, deployment orchestration -- should be automated. The developer should focus on what the application does, not how the plumbing works.
This philosophy rests on three pillars.
## Three Pillars
- **One Language** -- Write frontend, backend, and native code in a single language. Jac's codespace system lets you target the server (`sv`), browser (`cl`), or native binary (`na`) from the same file. The compiler handles interop -- HTTP calls, serialization, type sharing -- so you never write glue code.
  How Codespaces Work · Full-Stack Reference · See Jac vs a Traditional Stack
- **AI Native** -- Integrate LLMs at the language level with `by llm()` -- the compiler extracts semantics from your function names, types, and `sem` annotations to construct prompts automatically. First-class graphs and walkers give you an expressive agentic programming model where AI agents traverse structured state spaces with tool-calling built in.
  How by/sem Work · AI Integration Reference · Agentic Patterns
- **Scale Native** -- Your code doesn't change when you move from laptop to cloud. Declare `node` types and connect them to `root` -- the runtime handles persistence automatically. Run `jac start --scale` and your app deploys to Kubernetes with Redis, MongoDB, load balancing, and health checks provisioned for you. Zero DevOps.
  How Persistence Works · Deployment Reference · jac-scale Plugin
## One Language: Frontend, Backend, Native
Jac introduces codespaces -- regions of code that target different execution environments. Instead of maintaining separate projects in separate languages, you write everything in Jac and the compiler produces the right output for each target:
| Codespace | Target | Ecosystem | Syntax |
|---|---|---|---|
| Server | Python runtime | PyPI (numpy, pandas, fastapi) | `sv { }` or `.sv.jac` |
| Client | Browser/JavaScript | npm (react, tailwind, @mui) | `cl { }` or `.cl.jac` |
| Native | Compiled binary | C ABI | `na { }` or `.na.jac` |
Server definitions are visible to client blocks. When the client calls a server function, the compiler generates the HTTP request, serialization, and routing automatically. You write one language; the compiler produces the interop layer.
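To picture what that cross-codespace call looks like, here is a minimal sketch modeled on the app example at the top of this page -- the function and variable names are invented for illustration:

```jac
# Hypothetical sketch: a server function invoked from a client block.
# The HTTP request, routing, and serialization are generated by the compiler.
def:pub greet(name: str) -> str {
    return "Hello, " + name;                        # runs on the server
}

cl def:pub hello_view() -> JsxElement {
    has msg: str = "";
    async can with entry { msg = await greet("Jac"); }  # compiles to an HTTP call
    return <p>{msg}</p>;
}
```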
**See it in action.** Want to see exactly how much code Jac eliminates? Check out Jac vs Traditional Stack -- a side-by-side comparison showing ~30 lines of Jac vs >300 lines of Python + FastAPI + SQLite + TypeScript + React for the same Todo app.
## AI Native: LLMs as Code Constructs
Jac's approach to AI is called Meaning Typed Programming. Instead of writing prompts in strings and parsing responses manually, you declare what you want through function signatures and let the compiler handle the how:
```jac
# The function name, types, and return type ARE the specification
def classify_sentiment(text: str) -> str by llm;

# Enums constrain the LLM to valid outputs
enum Priority { LOW, MEDIUM, HIGH, CRITICAL }
def triage_ticket(description: str) -> Priority by llm();

# sem provides additional semantic context where names aren't enough
obj Ingredient {
    has name: str, cost: float, carby: bool;
}
sem Ingredient.cost = "Estimated cost in USD";
sem Ingredient.carby = "True if high in carbohydrates";

def plan_shopping(recipe: str) -> list[Ingredient] by llm();
```
The return type serves as the output contract -- enum means the LLM can only produce one of its values, obj means every field must be filled. No parsing code. No validation code. The type system enforces correctness.
For agentic workflows, Jac's graph constructs (nodes, edges, walkers) naturally model AI agents that traverse structured state spaces, make decisions with by llm(), and call tools:
```jac
def get_weather(city: str) -> str { return fetch_weather_api(city); }
def search_web(query: str) -> list[str] { return web_search_api(query); }

# The LLM decides which tools to call and in what order
def answer_question(question: str) -> str
    by llm(tools=[get_weather, search_web]);
```
byLLM Quickstart Tutorial · Agentic AI Tutorial
## Scale Native: No Code Changes from Laptop to Cloud
Every Jac program has a built-in root node. Nodes reachable from root are persistent -- they survive process restarts. The runtime generates storage schemas from your node declarations. You never write database code:
```jac
node Todo { has title: str, done: bool = False; }

with entry {
    root ++> Todo(title="Learn Jac");  # Automatically persisted
}
```
This same program runs three ways with no code changes:
| Command | What Happens |
|---|---|
| `jac app.jac` | Runs locally, SQLite persistence |
| `jac start app.jac` | HTTP API server, walkers become REST endpoints |
| `jac start --scale` | Kubernetes deployment with Redis, MongoDB, load balancing |
The runtime handles database schemas, user authentication (per-user graph isolation), API generation (Swagger docs at /docs), caching tiers, and Kubernetes orchestration. You write application logic; the runtime handles infrastructure.
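To make the "walkers become REST endpoints" row concrete, a hypothetical walker built on the `Todo` node above -- the endpoint shape is inferred from the description, not confirmed by this page:

```jac
# Hypothetical: under `jac start`, a walker like this would be exposed
# as a REST endpoint per the table above.
walker add_todo {
    has title: str;

    can create with `root entry {
        here ++> Todo(title=self.title);  # attached to root => persisted
        report "created";                  # reported values become the response
    }
}
```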
Production Deployment Tutorial · Kubernetes Tutorial
## Get Started in 5 Minutes
### Step 1: Install
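The install command was lost here. A plausible form, assuming everything ships from PyPI -- the package names below mirror this page's ecosystem list, but the exact install target is an assumption:

```shell
# Assumed: pip pulls the ecosystem packages described below
pip install jaclang byllm jac-client jac-scale jac-super
```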
This installs the complete Jac ecosystem: jaclang (compiler), byllm (AI integration), jac-client (frontend), jac-scale (deployment), and jac-super (enhanced console).
Verify your installation:
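The verification command was missing. Any successful invocation of the CLI confirms the install; a version check is the usual pattern (the exact flag is an assumption):

```shell
jac --version   # assumed flag; prints the installed Jac version
```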
This also warms the cache, making subsequent commands faster.
### Step 2: Create Your First Program
Create `hello.jac`:
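The file contents were missing here; a minimal Jac entry point using the standard `with entry` block:

```jac
with entry {
    print("Hello, Jac!");
}
```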
### Step 3: Run It
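The run command was missing; given that `jac` is shorthand for `jac run`, the invocation is presumably:

```shell
jac hello.jac
```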
Note: `jac` is shorthand for `jac run` -- both work identically.
That's it! You just ran your first Jac program.
## Choose Your Path
- **Just want to try it?** Follow the Installation guide to get set up and run your first program in 2 minutes.
- **Building a web app?** Jump to Build an AI Day Planner -- a complete 7-part tutorial covering backend, frontend, persistence, auth, and AI.
- **Working with AI/LLMs?** Start with the byLLM Quickstart, then explore Agentic AI for tool-calling agents and multi-agent systems.
- **Interested in graphs and OSP?** Read What Makes Jac Different for the concepts, then the OSP Tutorial for hands-on practice with nodes, edges, and walkers.
## Who is Jac For?
Jac is designed for developers who want to build AI-powered applications without the complexity of managing multiple languages and tools. If you've ever wished you could write your frontend, backend, AI logic, and deployment config in one place -- Jac is for you.
| You Are | Jac Gives You |
|---|---|
| Startup Founder | Ship complete products faster -- one language, one deploy command |
| AI/ML Engineer | Native LLM integration without prompt engineering overhead |
| Full-Stack Developer | React frontend + Python backend, no context switching |
| Python Developer | Familiar syntax with powerful new capabilities (Jac supersets Python) |
| Student/Learner | Modern language designed for clarity, with clean syntax AI models can read and write |
**What You Should Know.** Jac supersets Python, so Python familiarity is assumed throughout these docs. If you plan to use the full-stack features, basic React/JSX knowledge helps. No graph database experience is needed -- Jac teaches you that.
## Quick Links
| Resource | Description |
|---|---|
| Installation | Setup, first program, scaffolding, and Jacpacks |
| What Makes Jac Different | The three core concepts: codespaces, OSP, and AI integration |
| Syntax Cheatsheet | Comprehensive syntax reference |
| Build an AI Day Planner | Complete 7-part tutorial covering all Jac features |
| Language Reference | Complete language documentation |
| CLI Reference | All jac commands |
| FAQ | Learning paths by experience level |
## Need Help?
- Discord: Join our community server for questions and discussions
- GitHub: Report issues at Jaseci-Labs/jaseci
- JacGPT: Ask questions at jac-gpt.jaseci.org