Get started
An overview of the TurboStarter AI starter kit.
For the complete documentation index, see `llms.txt`. Prefer markdown by appending `.md` to documentation URLs or sending `Accept: text/markdown`.
TurboStarter AI is a starter kit with 10+ ready-to-use templates across web and mobile that helps you quickly build powerful AI applications without starting from scratch.
Whether you're launching a small side project or a full-scale product, it gives you the structure you need to start building immediately.
Features
TurboStarter AI comes packed with features designed to accelerate your development process:
Core framework
Monorepo setup
Powered by Turborepo for efficient code sharing and dependency management across web and mobile applications.
Next.js web app
Built with Next.js and the App Router (RSC by default), plus an opinionated structure for AI templates.
Hono API
Fast, TypeScript-first API layer shared by the web and mobile apps.
React Native + Expo
Foundation for cross-platform mobile apps that share business logic with your web application.
AI
AI SDK
Complete toolkit for implementing advanced AI features like streaming responses and interactive chat interfaces.
LangChain
Utilities for building RAG workflows like document loading, chunking, and retrieval.
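The chunking step mentioned above can be sketched in a few lines. This is an illustrative, dependency-free version of fixed-size splitting with overlap, the strategy text splitters (such as LangChain's) typically implement; sizes are measured in characters for simplicity, and the function name is our own.

```typescript
// Hypothetical sketch of fixed-size chunking with overlap for RAG pipelines.
// `chunkSize` and `overlap` are in characters; real splitters often work on
// tokens and respect sentence or paragraph boundaries.
function chunkText(text: string, chunkSize: number, overlap: number): string[] {
  if (overlap >= chunkSize) throw new Error("overlap must be smaller than chunkSize");
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last chunk reached the end
  }
  return chunks;
}
```

The overlap keeps context that straddles a chunk boundary retrievable from either side.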
Multiple AI providers
Seamless integration with OpenAI, Anthropic, Google AI, xAI, DeepSeek, Replicate, Fireworks, Eleven Labs, and more through a unified provider interface.
Specialized models
Full support for text generation, structured output, image generation, embeddings (RAG), transcription, and voice synthesis.
One-line model switching
Effortlessly switch between AI models or providers with minimal code changes.
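One way to picture the "one-line switch" is a registry that maps model ids to a common interface, so call sites only ever change an id string. This is a minimal sketch with assumed names and stubbed clients, not the kit's actual implementation:

```typescript
// Sketch of a unified model layer: all providers satisfy one interface,
// so switching providers is a one-line change at the call site.
interface ChatModel {
  id: string;
  complete(prompt: string): Promise<string>;
}

// Stub implementations stand in for real provider clients here.
const registry = new Map<string, ChatModel>([
  ["openai/gpt-4o", { id: "openai/gpt-4o", complete: async (p) => `[gpt-4o] ${p}` }],
  ["anthropic/claude-sonnet", { id: "anthropic/claude-sonnet", complete: async (p) => `[claude] ${p}` }],
]);

function resolveModel(id: string): ChatModel {
  const model = registry.get(id);
  if (!model) throw new Error(`Unknown model: ${id}`);
  return model;
}

// Swapping providers is just a different id string:
const model = resolveModel("anthropic/claude-sonnet");
```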
LiveKit
Real-time audio, video, and data streaming capabilities for collaborative AI and communication features.
Data storage
Drizzle ORM
Type-safe ORM for efficient interaction with PostgreSQL (default) or other supported databases (MySQL, SQLite).
PostgreSQL database
Reliable storage for chat history, user data, and vector embeddings with optimized performance.
Vector embeddings
Built-in support for storing and retrieving vector embeddings for advanced retrieval-augmented generation.
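At the core of that retrieval step is similarity ranking: stored document embeddings are scored against a query embedding, and the closest matches feed the generation prompt. In practice the database does this (e.g. via a vector extension), but the math is simple enough to sketch:

```typescript
// Illustrative sketch of the RAG retrieval step: rank stored embeddings by
// cosine similarity to a query embedding and keep the top-k matches.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

function topK(query: number[], docs: { id: string; embedding: number[] }[], k: number) {
  return [...docs]
    .sort((x, y) => cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, k);
}
```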
Blob storage
Integrated S3-compatible storage for managing user uploads, AI-generated content, and documents.
Authentication
Better Auth integration
Secure authentication system starting with anonymous sessions, extensible to email/password, magic links, and OAuth providers.
Rate limiting
Intelligent protection for API endpoints against abuse and overuse.
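Rate limiting for AI endpoints usually boils down to counting recent requests per user within a time window. The sketch below (our own simplified version, not the kit's implementation) keeps timestamps in memory and takes the clock as a parameter; a production setup would persist the window in a shared store such as Redis.

```typescript
// Hypothetical sliding-window rate limiter: allow at most `limit` requests
// per `windowMs` milliseconds for each key (e.g. a user or API key).
function createRateLimiter(limit: number, windowMs: number) {
  const hits = new Map<string, number[]>();
  return (key: string, now: number): boolean => {
    // Drop timestamps that have fallen out of the window.
    const recent = (hits.get(key) ?? []).filter((t) => now - t < windowMs);
    if (recent.length >= limit) {
      hits.set(key, recent);
      return false; // over the limit: reject
    }
    recent.push(now);
    hits.set(key, recent);
    return true; // allowed
  };
}
```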
Credits-based access
Flexible system to manage and control AI feature usage with customizable credit allocation.
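The idea behind credits-based access is that each AI feature has a cost and a request only proceeds if the user's balance covers it. Here is a minimal sketch with assumed feature names and costs, purely for illustration:

```typescript
// Sketch of credits-based gating: feature costs are configurable, and a
// request either deducts its cost or is rejected without changing the balance.
const costs: Record<string, number> = { chat: 1, "image-generation": 5 };

function spendCredits(balance: number, feature: string): { ok: boolean; balance: number } {
  const cost = costs[feature];
  if (cost === undefined) throw new Error(`Unknown feature: ${feature}`);
  if (balance < cost) return { ok: false, balance };
  return { ok: true, balance: balance - cost };
}
```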
Backend API key management
Security-first approach ensuring sensitive API keys remain protected on the server side.
User interface
Tailwind CSS & shadcn/ui
Utility-first CSS framework and pre-designed components for rapid UI development.
Base UI
Accessible, unstyled components that provide the foundation for beautiful, functional interfaces.
Shared UI package
Centralized UI component library ensuring consistency across all applications in the monorepo.
Templates
TurboStarter AI includes several production-ready template applications that showcase diverse AI capabilities. Use these examples to understand implementation patterns and jumpstart your own projects.
Chat
Build intelligent conversations with an AI chatbot featuring contextual reasoning, web search, and shareable chats.
Voice
Build real-time voice experiences, including streaming audio, transcription, and voice agents.
Image playground
Create visuals with a versatile AI image generator supporting multiple models, styles, and resolutions.
Retrieval-augmented generation
Extract insights from documents by having conversations with your files using AI.
Text to speech
Convert text into lifelike speech with thousands of voices across multiple languages.
Agents
Develop autonomous agents to execute complex tasks via multiple AI models.
Scope of this documentation
This documentation focuses specifically on the AI features, architecture, and demo applications included in the TurboStarter AI kit. It covers the AI integrations in depth; for core framework elements (authentication, billing, etc.), please refer to the Core documentation.
Our goal is to guide you through setting up, customizing, and deploying the AI starter kit efficiently. Where relevant, we include links to official documentation for the integrated AI providers and libraries.
Setup
Getting started with TurboStarter AI requires configuring the core applications first. For detailed setup instructions, refer to:
Web app setup
Follow our step-by-step guide in the Core web documentation to set up your web application.
Mobile app setup
Use our detailed guide in the Core mobile documentation to configure your mobile application.
After establishing the core applications, you can configure specific AI providers and demo applications using the dedicated sections in this documentation (see sidebar). For a quick start, you might also want to check our TurboStarter CLI guide to bootstrap your project in seconds.
When working with the AI starter kit, remember to use the `ai` repository instead of `core` for Git commands. For example, use `git clone turbostarter/ai` rather than `git clone turbostarter/core`.
Deployment
Deploying TurboStarter AI follows the same process as deploying the core web application. Ensure you configure all necessary environment variables, including those for your selected AI providers (like OpenAI, Anthropic, etc.), in your deployment environment.
For comprehensive deployment instructions across various platforms, consult our core deployment guides:
Deployment checklist
General checklist before deploying the web app.
Vercel
Streamlined deployment process for Vercel hosting.
Railway
Step-by-step guide for deploying to Railway.
Docker
Container-based deployment using Docker.
Other providers
Additional guides for Netlify, Render, AWS Amplify, Fly.io and more.
For mobile app store deployment, refer to our mobile publishing guides:
Publishing checklist
Comprehensive pre-publishing verification for mobile applications.
iOS App Store
Publish your iOS app to the Apple App Store.
Google Play Store
Publish your Android app to the Google Play Store.
Updates
Best practices for managing updates to published mobile apps.
Each AI demo app may have specific deployment considerations, so check their dedicated documentation sections for additional guidance.
AI-assisted development
TurboStarter comes with built-in rules, skills, subagents, and commands designed specifically to make AI-enhanced development easier. These project-specific AI helpers guide large language models (LLMs) to understand your codebase, enforce best practices, and maintain consistency throughout your project.
Major AI coding assistants - such as Cursor, Claude, Codex, Antigravity, and others - work seamlessly with this setup. Simply open the TurboStarter AI project in your preferred AI tool to get intelligent code assistance right away.
Additionally, you'll find a /llms.txt file containing up-to-date, LLM-optimized documentation, which allows you to query the latest details about TurboStarter directly from your AI assistant.
If you'd like a step-by-step walkthrough, check out our AI-assisted development guide.
Let's build amazing AI SaaS!
We're excited to help you create innovative AI-powered applications quickly and efficiently. If you have questions, encounter issues, or want to showcase your creations, connect with our community.
Happy building!