Natural language isn't
natural for agents.

60% fewer bytes. Same instruction. Every hop.
At scale that's the difference between a breakthrough and a bill.

The Octid Semantic Mesh Protocol (OSMP) is an open encoding standard that compresses agentic AI instructions 60–87% and decodes by table lookup — no inference required at the receiving node. Any device. Any channel.

60–87% byte reduction
51-byte LoRa floor
3 SDKs: Py · TS · Go
0 inference at decode

NL takes a big byte.

Every inter-agent instruction today is a verbose, cloud-dependent, inference-heavy payload. That works fine in a data center, but fails completely under edge constraints like the LoRa radio floor. Tokens get torched at scale, and natural language locks sovereign edge systems into infrastructure designed for full-scale models — not the lightweight agents that actually run there.

Natural language / JSON
"If heart rate at node 1 exceeds 120, assemble casualty report and broadcast evacuation to all nodes."
→ 100 bytes. Requires inference to parse.
Fails at 51-byte LoRa floor. Costs tokens at every hop.
100 bytes · inference required
OSMP / SAL
H:HR@NODE1>120→H:CASREP∧M:EVA@*
→ 35 bytes. Decode is a table lookup.
Fits a single LoRa packet. Zero inference cost at receive.
35 bytes · 65% smaller
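The byte counts above can be checked directly. A minimal sketch measuring the UTF-8 size of each form against the 51-byte LoRa payload floor:

```python
# Byte sizes of the two forms shown above, measured as UTF-8.
NL = ("If heart rate at node 1 exceeds 120, assemble casualty report "
      "and broadcast evacuation to all nodes.")
SAL = "H:HR@NODE1>120→H:CASREP∧M:EVA@*"  # arrow and wedge are 3-byte UTF-8

LORA_FLOOR = 51  # smallest guaranteed LoRaWAN application payload, in bytes

nl_bytes = len(NL.encode("utf-8"))    # 100
sal_bytes = len(SAL.encode("utf-8"))  # 35

print(f"NL: {nl_bytes} B, SAL: {sal_bytes} B, "
      f"{100 * (nl_bytes - sal_bytes) // nl_bytes}% smaller")
print(f"fits one LoRa packet: {sal_bytes <= LORA_FLOOR}")
```

Note the two non-ASCII operators: the SAL string is 29 ASCII characters plus two 3-byte code points, hence 35 bytes, not 31.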

The benchmark is in the repo.

48-vector canonical conformance suite. All three SDKs verified independently. Wire compatible across Python, TypeScript, and Go.

Byte reduction by instruction domain
Mean UTF-8 compression vs natural language
Municipal · 82%
Emergency · 81%
Clinical · 75%
Financial · 70%
Agentic · 65%
Parallel · 50%
Mean · 60.6%
LoRa floor compatibility
Instructions fitting the 51-byte single-packet constraint
Fit in 51 bytes · 39 / 48 (82% single packet)
Multi-packet instructions · 9 / 48
Decode errors · 0 / 48
Cross-SDK conformance
Wire compatible — identical decode results on an 86-instruction corpus
Python · 60.6% · 122/122 tests
TypeScript · 60.7% · 112/112 tests
Go · 60.7% · 10/10 tests
Tier 2 · 86/86 wire compatible
Two-tier corpus compression
SAL encoding + LZMA on a knowledge corpus at rest
NL + LZMA baseline
OSMP + LZMA (two-tier) · 3.7×
72.7% total corpus reduction
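The two-tier idea can be sketched with the standard library alone: tier 1 is the compact SAL encoding, tier 2 is LZMA over the corpus at rest. The corpus below is a synthetic stand-in (repeated variants of the example instruction), so the ratio will not match the benchmark's 3.7×:

```python
import lzma

# Synthetic stand-in corpus: many variants of the example instruction.
# Tier 1: the compact SAL form. Tier 2: LZMA over the corpus at rest.
nl_corpus = "\n".join(
    f"If heart rate at node {i} exceeds 120, assemble casualty report "
    "and broadcast evacuation to all nodes." for i in range(500)
).encode("utf-8")
sal_corpus = "\n".join(
    f"H:HR@NODE{i}>120→H:CASREP∧M:EVA@*" for i in range(500)
).encode("utf-8")

baseline = len(lzma.compress(nl_corpus))   # NL + LZMA
two_tier = len(lzma.compress(sal_corpus))  # SAL + LZMA

print(f"raw NL {len(nl_corpus)} B -> NL+LZMA {baseline} B")
print(f"raw SAL {len(sal_corpus)} B -> SAL+LZMA {two_tier} B")
```

The point of the two tiers is that they compound: LZMA removes redundancy across instructions, while the SAL tier has already removed the verbosity inside each one.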

Built for the builders
advancing the horizon.

Open source inventors and entrepreneurs who want efficiency to compound their growth — not their costs.

OpenClaw Developers
Replace JSON payloads with OSMP encoding. Same framework. 60–87% smaller messages. TypeScript SDK today. Safe to adopt unilaterally — falls back automatically if the other end isn't there yet.
PicoClaw & Edge Hardware
Go SDK compiles the full dictionary into the binary. No network. No file I/O. No cloud. A $10 edge device is a sovereign OSMP node. Boot cold, decode immediately.
Meshtastic Builders
OSMP encodes the payload Meshtastic carries. Transport unchanged. Agentic instructions at LoRa range. Single packet. C++ integration is an open contribution target — spec is complete.
Enterprise Agent Mesh
Every inter-agent hop burns tokens. OSMP compresses the wire layer — context window reduction, egress savings, lower inference cost. The math compounds in your favor instead of against you.

Encode once.
Decode anywhere.

01 — Encode
LLM outputs OSMP
Your LLM already handles structured output. Point the system prompt at the OSMP grammar and dictionary. No new tooling. No migration. No framework changes required.
02 — Transmit
Any channel
The same instruction that fits a 51-byte LoRa packet flows through HTTP, BLE, satellite, or wired. The protocol's floor guarantee ensures the encoding never makes a message larger than its natural language equivalent.
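The floor guarantee reduces to a final length check at the sender. A sketch, where the `encoded` argument stands in for the output of an OSMP encoder (assumed, not shown):

```python
def apply_floor_guarantee(natural: str, encoded: str) -> bytes:
    """Ship the encoded form only if it is no larger than the NL source.

    `encoded` stands in for the output of an OSMP encoder; this sketch
    shows only the guarantee itself: the wire payload never exceeds the
    natural language equivalent.
    """
    nl_bytes = natural.encode("utf-8")
    enc_bytes = encoded.encode("utf-8")
    return enc_bytes if len(enc_bytes) <= len(nl_bytes) else nl_bytes

payload = apply_floor_guarantee(
    "If heart rate at node 1 exceeds 120, assemble casualty report "
    "and broadcast evacuation to all nodes.",
    "H:HR@NODE1>120→H:CASREP∧M:EVA@*",
)
print(len(payload))  # 35
```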
03 — Decode
Table lookup. No inference.
The receiving node decodes by dictionary lookup. No model. No inference step. No ambiguity resolution. Any device capable of string processing is a sovereign OSMP node.
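A table-lookup decode fits in a few lines. The dictionary below is a toy stand-in for illustration, not the official OSMP dictionary; the point is that expansion is pure string substitution, longest code first, with no model in the loop:

```python
# Toy stand-in dictionary for illustration -- not the official OSMP
# dictionary. Decode is pure string substitution, longest code first:
# no model, no inference step, no ambiguity resolution.
DICTIONARY = {
    "H:CASREP": "assemble casualty report",
    "@NODE1": " at node 1 ",
    "M:EVA": "broadcast evacuation",
    "H:HR": "heart rate",
    "@*": " to all nodes",
    ">": "exceeds ",
    "→": ", then ",
    "∧": " and ",
}

def decode(message: str) -> str:
    """Expand an encoded instruction by dictionary lookup alone."""
    for code in sorted(DICTIONARY, key=len, reverse=True):
        message = message.replace(code, DICTIONARY[code])
    return message

print(decode("H:HR@NODE1>120→H:CASREP∧M:EVA@*"))
```

Run against the example instruction above, this recovers a readable equivalent of the natural language form; any device that can do string replacement can run it.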

OSMP operates beneath
every other agent protocol.

Protocol         | Transport   | Offline | Compression | Inference-free decode | Layer
MCP (Anthropic)  | HTTP/JSON   | –       | –           | –                     | Framework
A2A (Google)     | HTTPS/JSON  | –       | –           | –                     | Framework
ACP (IBM)        | REST/HTTP   | –       | –           | –                     | Framework
OSMP             | Any channel | ✓       | 60–87%      | ✓                     | Encoding

The protocol is open.
The SDKs are shipping.

Apache 2.0. Python, TypeScript, and Go. The benchmark is in the repo.