AI

Learn how to use AI integration in your mobile app.

Because AI integration for the web app, browser extension, and mobile app is built on the same battle-tested Vercel AI SDK, the implementation is very similar across platforms.

In this section, we'll focus on how to consume AI responses in the mobile app. For server-side implementation details, please refer to the web documentation.

Features

The most common AI integration features are also supported in the mobile app:

  • Chat: Build chat interfaces inside native mobile apps.
  • Streaming: Receive AI responses as soon as the model starts generating them, without waiting for the full response to be completed.
  • Image generation: Generate images based on a given prompt.

You can easily compose your application using these building blocks or extend them to suit your specific needs.

Usage

The usage of AI integration in the mobile app is the same as for the web app and browser extension. We use the exact same API endpoint, and since TurboStarter ships with built-in support for streaming on mobile, we can leverage it to display answers incrementally to the user as they're generated.

ai.tsx
import { useState } from "react";
import { Button, Text, View } from "react-native";

import { api } from "~/lib/api/react";

const AI = () => {
  const [answer, setAnswer] = useState("");
  const { mutate, isPending } = api.ai.chat.useMutation({
    onSuccess: async (data) => {
      // The mutation resolves to an async iterable of text chunks,
      // so we append each chunk to the answer as it arrives.
      for await (const chunk of data) {
        setAnswer((prev) => prev + chunk);
      }
    },
  });

  return (
    <View>
      {/* The exact input shape depends on your API router definition. */}
      <Button
        title="Ask"
        disabled={isPending}
        onPress={() => mutate({ prompt: "Hello!" })}
      />
      <Text>{answer}</Text>
    </View>
  );
};

export default AI;

TurboStarter comes with a ready-to-use implementation of AI chat, so you can see this solution in action. Feel free to reuse or modify it to suit your needs.
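Outside of React, the incremental accumulation performed in `onSuccess` above boils down to iterating an async iterable of text chunks. Here is a minimal standalone sketch of that pattern; `fakeStream` is a stand-in for the streamed mutation result, and the exact shape of the real stream is an assumption:

```typescript
// Stand-in for the streamed response: an async iterable of text chunks.
async function* fakeStream(): AsyncGenerator<string> {
  for (const chunk of ["Hello", ", ", "world", "!"]) {
    yield chunk;
  }
}

// Accumulate chunks the same way the component does with
// setAnswer((prev) => prev + chunk), recording each intermediate state.
async function collect(stream: AsyncIterable<string>): Promise<string[]> {
  const states: string[] = [];
  let answer = "";
  for await (const chunk of stream) {
    answer += chunk;
    states.push(answer);
  }
  return states;
}
```

Each intermediate state is what the user would see rendered at that point in the stream, with the final state being the complete answer.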
