All notable changes to this project will be documented in this file.
Initial release of the OCaml AI SDK — a type-safe, provider-agnostic AI model abstraction inspired by the Vercel AI SDK, targeting AI SDK v6 wire compatibility.
### ai_provider

- `Provider_options` for compile-time type-safe provider-specific settings
- Prompt types (`System` = string only, `User` = text + files, etc.)
- `Language_model.S` module type with first-class module wrapper
- `Tool`, `Tool_choice`, `Mode`, `Content` foundation types
- `Finish_reason`, `Usage`, `Warning`, `Provider_error` types
- `Provider.S` and `Middleware.S` module type signatures
- `Call_options`, `Generate_result`, `Stream_part`, `Stream_result` types

### ai_provider_anthropic

- Thinking support with `budget_tokens` smart constructor (>= 1024)
- `Cache_control` for prompt caching
- `Anthropic_options` via the extensible GADT system
- `max_tokens`

### ai_provider_openai

### ai_core

- `generate_text` — synchronous text generation with a multi-step tool loop
- `stream_text` — streaming text generation with a multi-step tool loop; returns synchronously, with streams filled by a background Lwt task
- `Output.text`, `Output.object_`, `Output.enum`, `Output.array`, `Output.choice` with JSON Schema validation
- `data: {json}\n\n` encoding with the `x-vercel-ai-ui-message-stream: v1` header; all v6 chunk types
- `Ui_message_stream_writer` — composable stream builder with `write` (synchronous) and `merge` (non-blocking via `Lwt.async`), lifecycle management, ref-counted in-flight merge tracking, and an `on_finish` callback
- Tool approval: `needs_approval` predicate on `Core_tool.t`, step-loop partitioning, `Tool_approval_request` chunk type, stateless re-submission with `approved_tool_call_ids`
- `Stop_condition` — step-loop termination predicates matching upstream `stopWhen`: `step_count_is`, `has_tool_call`, `is_met` (OR semantics with short-circuit); wired through `generate_text`, `stream_text`, and `server_handler`; `max_steps` remains an independent hard safety cap

### ai-sdk-react

- `useChat` and `useCompletion` hook bindings for `@ai-sdk/react`
- `data_ui_part`
- `classify` function for part type dispatch

### Examples

- `one_shot`, `streaming`, `tool_use`, `thinking`, `generate`, `stream_chat`, `agent_loop` — standalone CLI examples
- `chat_server` — cohttp chat server with React frontend, tool approval, and structured output
- `custom_stream` — custom data streaming with a Melange frontend
- `ai-e2e` — end-to-end Melange app with 11 demos (basic chat, reasoning, tool use, tool approval, client tools, file attachments, structured output, completion, web search, retry/regenerate)

### Build

- `generate_opam_files` for automated opam file generation

### mlx-pp / ocamlformat-mlx
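The `Stop_condition` entry describes termination predicates with OR semantics and short-circuit evaluation. A minimal, self-contained sketch of that idea follows; the `step` record and the exact function signatures here are assumptions for illustration, not the SDK's actual API:

```ocaml
(* Illustrative sketch only: type and signatures are assumptions. *)
type step = { step_index : int; tool_calls : string list }

type stop_condition = step list -> bool

(* Stop once the loop has run [n] steps. *)
let step_count_is n : stop_condition = fun steps -> List.length steps >= n

(* Stop once any step has called the named tool. *)
let has_tool_call name : stop_condition =
  fun steps -> List.exists (fun s -> List.mem name s.tool_calls) steps

(* OR semantics with short-circuit: stop when any condition holds.
   [List.exists] returns as soon as one predicate is true. *)
let is_met conditions steps = List.exists (fun cond -> cond steps) conditions

let () =
  let steps = [ { step_index = 0; tool_calls = [ "weather" ] } ] in
  assert (is_met [ step_count_is 3; has_tool_call "weather" ] steps);
  assert (not (is_met [ step_count_is 3; has_tool_call "search" ] steps));
  print_endline "ok"
```

Under this reading, `max_steps` would remain a separate hard cap checked outside `is_met`, so a misconfigured condition list can never loop forever.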
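The UI message stream entry mentions `data: {json}\n\n` framing with the `x-vercel-ai-ui-message-stream: v1` header. A hedged sketch of that wire format, with a chunk payload invented purely for illustration:

```ocaml
(* Illustrative sketch only: the chunk payload below is an assumption,
   not one of the SDK's actual v6 chunk types. *)

(* Frame one JSON chunk as a server-sent event: "data: {json}\n\n". *)
let encode_sse_chunk json = Printf.sprintf "data: %s\n\n" json

(* Response header advertising the v6 UI message stream protocol. *)
let stream_header = ("x-vercel-ai-ui-message-stream", "v1")

let () =
  let frame = encode_sse_chunk {|{"type":"text-delta","delta":"Hi"}|} in
  assert (frame = {|data: {"type":"text-delta","delta":"Hi"}|} ^ "\n\n");
  print_endline (fst stream_header ^ ": " ^ snd stream_header)
```

The double newline terminates each SSE event, which is what lets a browser-side reader split the stream back into individual chunks.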
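The Anthropic entry notes a `budget_tokens` smart constructor enforcing the `>= 1024` minimum. One plausible shape for such a constructor, sketched with an abstract type so invalid budgets cannot be constructed (module and function names here are assumptions):

```ocaml
(* Illustrative sketch only: names and error type are assumptions. *)
module Budget_tokens : sig
  type t
  val of_int : int -> (t, string) result
  val to_int : t -> int
end = struct
  type t = int

  (* Reject budgets below Anthropic's 1024-token minimum for thinking. *)
  let of_int n =
    if n >= 1024 then Ok n
    else Error (Printf.sprintf "budget_tokens must be >= 1024, got %d" n)

  let to_int t = t
end

let () =
  (match Budget_tokens.of_int 2048 with
   | Ok b -> assert (Budget_tokens.to_int b = 2048)
   | Error _ -> assert false);
  (match Budget_tokens.of_int 512 with
   | Ok _ -> assert false
   | Error _ -> ());
  print_endline "ok"
```

Keeping `t` abstract means any value of type `Budget_tokens.t` is valid by construction, so downstream request-building code needs no re-validation.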