Escape complex toolchains. Build server apps on a single platform made for AI and human collaboration.
Duso's language and architecture are designed for fast, effective work. Get more done with less code, zero compile time, and zero dependencies.
Duso was made to use fewer tokens to read, code, and debug. The consistent syntax means fewer prompts and faster responses, saving you time and money.
Currently: OpenAI, Claude, Ollama, Azure AI, and Groq. Open source means community contributions can keep up with new AI releases.
Restrict local file access and use the built-in seamless virtual file system. Run AI agents safely in a controlled environment with managed resource access.
Multi-line string templates for prompting, built-in Markdown rendering for HTML and ANSI terminals, fast concurrent IO processing.
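A hypothetical sketch of the templating and rendering features, assuming the triple-quoted `{{...}}` templates and `markdown_ansi` function behave as in the other samples on this page:

```
// multi-line string template with {{...}} interpolation
topic = "black holes"
prompt = """
You are a science writer.
Explain {{topic}} in two short Markdown paragraphs.
"""
// render a Markdown string with ANSI styling for the terminal
print(markdown_ansi("# Hello, *terminal*"))
```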
Fast, thread-safe, file-backed, in-memory objects that are shared between processes. Use them for storage, caching, and process communication.
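A minimal sketch of shared state between scripts, assuming the `datastore(name)` handle with `get`/`set` used in the API examples on this page:

```
// open (or create) a named store; it is file-backed and
// shared with any other process that opens the same name
users = datastore("users")
users.set("u1", {name = "Ada"})
// another script can read the same entry by key
print(users.get("u1").name)
```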
Run and coordinate hundreds or thousands of agents at the same time. Duso's elegant architecture is simple to learn and manage at scale.
10MB binary built for Linux, macOS, and Windows. Use as an AI playground, dev server, or even a production server.
Single self-contained binary that has its own virtual filesystem of all its libraries, docs, and examples. Build once, run anywhere.
Includes routing, websockets, concurrent connections, https, jwt, cors, sessions, etc. Use it to build your own web apps or build your scripts into a production server.
Give your favorite AI access to coding tools inside a safe sandboxed environment. All self-contained with no external dependencies.
Interactive debugger with breakpoints, code context, and stack traces. Concurrent issues queue up so you can manage them one at a time.
Bundle your application scripts directly into a self-contained binary that runs anywhere. You can also extend with Go if you have special requirements.
Build Duso directly into your own existing applications. Give your users the power to write scripts without shipping a language runtime.
Comprehensive guides and API docs optimized for both human and AI consumption. Self-contained in your binary so docs are always available offline.
Apache 2.0 licensed and fully open source. Contribute back to the community and benefit from a growing ecosystem of libraries and tools.
Duso was designed to be quickly learned by most AI coding assistants.
# start here!
duso -read
# look up a specific function or lib
duso -doc claude
Start with a simple sample project.
duso -init myproject
One-shot Q&A style prompts are easy.
ai = require("ollama")
print(ai.prompt("sup?"))
You can add a system prompt and tools, but this will get you started. Displays a cute "busy" spinner while waiting for a response.
ai = require("openai")
chat = ai.session()
while true do
prompt = input("\n\nYou: ")
if lower(prompt) == "exit" then break end
write("\n\nOpenAI: ")
busy("thinking...")
write(chat.prompt(prompt))
end
Prompts experts in parallel, gathers their responses, then runs them through a summary prompt.
ai = require("claude")
prompt = input("Ask the panel: ")
busy("asking...")
// build an array of functions to prompt
// each expert, then run them all in parallel
experts = ["Astronomer", "Astrologer", "Biologist", "Accountant"]
responses = parallel(map(experts, function(expert)
return function()
return ai.prompt(prompt, {
system = """
You are an expert {{expert}}. Always reason and
interact from this mindset. Limit your field of
knowledge to this expertise.
""",
max_tokens = 500
})
end
end))
// prepend the expert type into their response
for i = 0, 3 do
responses[i] = "{{experts[i]}} says: {{responses[i]}}"
end
busy("summarizing...")
summary = ai.prompt("""
Summarize these responses:
{{join(responses, "\n\n---\n\n")}}
List the 3 things they have in common.
Then list the 3 things that are the most different.
Don't separate with ---
""")
// format the markdown response for the terminal
print(markdown_ansi(summary))
You can run a web server from the command line.
# http://localhost:8080
duso -c 'http_server().start()'
Build an HTTP API with routing and JSON responses.
Best practice is one handler script per route to make
each more manageable and efficient at runtime. Each time
a route is triggered, http_server runs a new instance
of that script to handle it.
// server.du
port = 3000
server = http_server({port = port})
server.route("GET", "/api/user/:id", "user-get.du")
server.route("POST", "/api/user", "user-post.du")
print("Server running at http://localhost:{{port}}")
server.start()
// ...script blocks while server is running...
print("Server stopped.")
// user-get.du
ctx = context()
req = ctx.request()
res = ctx.response()
user = datastore("users").get(req.params.id)
if not user then res.error(404) end
res.json({
success = true,
data = user
}, 200)
// user-post.du
ctx = context()
req = ctx.request()
res = ctx.response()
id = uuid()
datastore("users").set(id, req.body)
res.json({
id = id,
name = req.body.name
}, 201)
Most languages prioritize human expressiveness and can be challenging for AI. For example, Python and JavaScript offer countless ways to solve the same problem, filled with subtle footguns and "magic" behavior. Their massive ecosystems, with thousands of overlapping modules and hidden dependencies, often confuse both humans and AI. Systems languages like Go reduce ambiguity and debugging but add complexity that slows development.
Duso is intentionally boring and predictable. No clever syntax tricks. No multiple ways to do the same thing. Every pattern is consistent and straightforward so AI can reason about code reliably, write better scripts faster, and use fewer tokens doing it. Plus, its entire runtime and ecosystem are included in a single binary. No package management or version conflicts. Being built for LLMs first means everything is frictionless, so you and your AI work more productively together.