๋ฉ”์ธ ์ฝ˜ํ…์ธ ๋กœ ๊ฑด๋„ˆ๋›ฐ๊ธฐ
Sonamu builds on the Vercel AI SDK and exposes a range of AI features. Text generation, streaming, tool calling, speech recognition, and more can be implemented with little effort.

Vercel AI SDK

The AI framework that Sonamu uses. Key capabilities:
  • Text generation
  • Streaming responses
  • Tool calling
  • Structured output
  • Transcription (speech recognition)

ํ…์ŠคํŠธ ์ƒ์„ฑ

generateText()

์ผ๋ฐ˜ ํ…์ŠคํŠธ๋ฅผ ์ƒ์„ฑํ•ฉ๋‹ˆ๋‹ค.
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai('gpt-4o'),
  prompt: 'Show me how to build an Express server with TypeScript',
});

console.log(result.text);
// => "To build an Express server..."

Message-Based Conversation

const result = await generateText({
  model: openai('gpt-4o'),
  messages: [
    { role: 'system', content: 'You are a friendly programming assistant.' },
    { role: 'user', content: 'What is TypeScript?' },
    { role: 'assistant', content: 'TypeScript is JavaScript with added static types...' },
    { role: 'user', content: 'What are its advantages?' },
  ],
});

์ŠคํŠธ๋ฆฌ๋ฐ ์‘๋‹ต

streamText()

Streams text in real time.
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

const result = streamText({
  model: openai('gpt-4o'),
  prompt: 'Tell me a long story',
});

// process the stream
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}

SSE Integration

import { BaseModel, stream, api, Context } from "sonamu";
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';
import { z } from "zod";

class ChatModelClass extends BaseModel {
  @stream({
    type: 'sse',
    events: z.object({
      chunk: z.object({
        text: z.string(),
      }),
      complete: z.object({
        totalTokens: z.number(),
      }),
    })
  })
  @api({ compress: false })
  async *streamChat(message: string, ctx: Context) {
    const sse = ctx.createSSE(
      z.object({
        chunk: z.object({
          text: z.string(),
        }),
        complete: z.object({
          totalTokens: z.number(),
        }),
      })
    );
    
    try {
      const result = streamText({
        model: openai('gpt-4o'),
        messages: [
          { role: 'user', content: message },
        ],
      });
      
      // push each chunk to the client as it arrives
      for await (const chunk of result.textStream) {
        sse.publish('chunk', { text: chunk });
      }
      
      // completion stats
      const usage = await result.usage;
      sse.publish('complete', {
        totalTokens: usage.totalTokens,
      });
    } finally {
      await sse.end();
    }
  }
}
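On the client, the endpoint above produces standard SSE frames (`event:` / `data:` lines). As a rough sketch of how those frames map back to the `chunk` / `complete` events, here is a hypothetical parser. It assumes one `event:` line and one `data:` line per frame, which matches typical SSE output but is not something Sonamu itself guarantees:

```typescript
type SSEEvent = { event: string; data: unknown };

// Parse a raw text/event-stream body into (event, JSON data) pairs.
// Assumes each frame looks like "event: <name>\ndata: <json>\n\n".
function parseSSE(raw: string): SSEEvent[] {
  return raw
    .split('\n\n')
    .filter((frame) => frame.trim().length > 0)
    .map((frame) => {
      const lines = frame.split('\n');
      const event = lines.find((l) => l.startsWith('event: '))?.slice(7) ?? 'message';
      const dataLine = lines.find((l) => l.startsWith('data: '))?.slice(6) ?? 'null';
      return { event, data: JSON.parse(dataLine) };
    });
}
```

In a browser you would normally use `EventSource` or read the `fetch` body stream instead of buffering the whole response; the parser only illustrates the framing.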

Tool Calling

๋‹จ์ผ ๋„๊ตฌ

import { openai } from '@ai-sdk/openai';
import { generateText, tool } from 'ai';
import { z } from 'zod';

const result = await generateText({
  model: openai('gpt-4o'),
  prompt: '์„œ์šธ์˜ ํ˜„์žฌ ๋‚ ์”จ๋ฅผ ์•Œ๋ ค์ค˜',
  tools: {
    getWeather: tool({
      description: 'Looks up the current weather for a given city',
      parameters: z.object({
        city: z.string().describe('city name'),
      }),
      execute: async ({ city }) => {
        // call a weather API
        const weather = await fetchWeather(city);
        return {
          temperature: weather.temp,
          condition: weather.condition,
        };
      },
    }),
  },
  maxSteps: 5,  // maximum number of tool-call steps
});

console.log(result.text);
// => "The weather in Seoul is currently clear, 15 degrees."

๋‹ค์ค‘ ๋„๊ตฌ

const result = await generateText({
  model: openai('gpt-4o'),
  prompt: '์„œ์šธ์˜ ๋‚ ์”จ๋ฅผ ๋ณด๊ณ  ์šฐ์‚ฐ์ด ํ•„์š”ํ•œ์ง€ ์•Œ๋ ค์ค˜',
  tools: {
    getWeather: tool({
      description: 'Look up the weather',
      parameters: z.object({
        city: z.string(),
      }),
      execute: async ({ city }) => {
        return await fetchWeather(city);
      },
    }),
    checkUmbrella: tool({
      description: 'Decide whether an umbrella is needed based on the weather',
      parameters: z.object({
        condition: z.string().describe('weather condition'),
      }),
      execute: async ({ condition }) => {
        return {
          needUmbrella: ['rain', 'snow'].includes(condition),
        };
      },
    }),
  },
  maxSteps: 10,
});

๊ตฌ์กฐํ™”๋œ ์ถœ๋ ฅ

generateObject()

Generates structured data as JSON.
import { openai } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import { z } from 'zod';

const result = await generateObject({
  model: openai('gpt-4o'),
  schema: z.object({
    name: z.string(),
    age: z.number(),
    hobbies: z.array(z.string()),
    address: z.object({
      city: z.string(),
      country: z.string(),
    }),
  }),
  prompt: 'ํ™๊ธธ๋™์— ๋Œ€ํ•œ ์ •๋ณด๋ฅผ ์ƒ์„ฑํ•ด์ค˜',
});

console.log(result.object);
// {
//   name: "Hong Gildong",
//   age: 30,
//   hobbies: ["reading", "travel"],
//   address: {
//     city: "Seoul",
//     country: "South Korea"
//   }
// }

Practical Example

import { BaseModel, api } from "sonamu";
import { openai } from '@ai-sdk/openai';
import { generateObject } from 'ai';
import { z } from 'zod';

class ProductModelClass extends BaseModel {
  @api({ httpMethod: 'POST' })
  async generateProductDescription(productName: string) {
    const result = await generateObject({
      model: openai('gpt-4o'),
      schema: z.object({
        title: z.string(),
        description: z.string(),
        features: z.array(z.string()),
        price: z.number(),
        tags: z.array(z.string()),
      }),
      prompt: `Generate a product description for ${productName}`,
    });
    
    // save to DB
    const product = await this.saveOne({
      name: result.object.title,
      description: result.object.description,
      features: result.object.features,
      price: result.object.price,
      tags: result.object.tags,
    });
    
    return product;
  }
}

Rtzr Provider (Speech Recognition)

Sonamu ships with built-in support for Rtzr, a Korean speech-recognition service.

Configuration

RTZR_CLIENT_ID=your_client_id
RTZR_CLIENT_SECRET=your_client_secret

Basic Usage

import { rtzr } from 'sonamu/ai/providers/rtzr';

const model = rtzr.transcription('whisper');

const result = await model.doGenerate({
  audio: audioBuffer,  // Uint8Array or Base64
  mediaType: 'audio/wav',
});

console.log(result.text);
// => "์•ˆ๋…•ํ•˜์„ธ์š”, ์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ์ข‹๋„ค์š”"

console.log(result.segments);
// [
//   { text: "์•ˆ๋…•ํ•˜์„ธ์š”", startSecond: 0, endSecond: 1 },
//   { text: "์˜ค๋Š˜ ๋‚ ์”จ๊ฐ€ ์ข‹๋„ค์š”", startSecond: 1, endSecond: 3 }
// ]
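The `segments` array pairs each piece of recognized text with start/end seconds. A small helper (hypothetical, not part of Sonamu or the Rtzr provider) can render it as a timestamped transcript:

```typescript
type Segment = { text: string; startSecond: number; endSecond: number };

// Render segments as "[m:ss-m:ss] text" lines.
function formatTranscript(segments: Segment[]): string {
  const fmt = (s: number) =>
    `${Math.floor(s / 60)}:${String(Math.floor(s % 60)).padStart(2, '0')}`;
  return segments
    .map((seg) => `[${fmt(seg.startSecond)}-${fmt(seg.endSecond)}] ${seg.text}`)
    .join('\n');
}
```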

ํŒŒ์ผ ์—…๋กœ๋“œ + ์Œ์„ฑ ์ธ์‹

import { BaseModel, Sonamu, upload, api } from "sonamu";
import { rtzr } from 'sonamu/ai/providers/rtzr';

class TranscriptionModelClass extends BaseModel {
  @upload({ mode: 'single' })
  @api({ httpMethod: 'POST' })
  async transcribeAudio() {
    const { file } = Sonamu.getUploadContext();
    
    if (!file) {
      throw new Error('No audio file provided');
    }
    
    // transcribe the audio
    const model = rtzr.transcription('whisper');
    const buffer = await file.toBuffer();
    
    const result = await model.doGenerate({
      audio: buffer,
      mediaType: file.mimetype,
    });
    
    // save to DB
    await this.saveOne({
      audio_url: file.url,
      transcription: result.text,
      segments: result.segments,
      language: result.language,
      duration: result.durationInSeconds,
    });
    
    return {
      text: result.text,
      segments: result.segments,
    };
  }
}

Rtzr Options

const result = await model.doGenerate({
  audio: audioBuffer,
  mediaType: 'audio/wav',
  providerOptions: {
    rtzr: {
      domain: 'GENERAL',  // 'CALL' | 'GENERAL'
      language: 'ko',
      diarization: true,  // speaker diarization
      wordTimestamp: true,  // per-word timestamps
      profanityFilter: false,  // profanity filter
    }
  }
});

Multimodal (Image Input)

GPT-4o accepts images as input.
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

const result = await generateText({
  model: openai('gpt-4o'),
  messages: [
    {
      role: 'user',
      content: [
        { type: 'text', text: 'What is in this image?' },
        {
          type: 'image',
          image: imageBuffer,  // Uint8Array or URL
        },
      ],
    },
  ],
});

console.log(result.text);
// => "The image shows a cat..."

Image Upload + Analysis

import { BaseModel, Sonamu, upload, api } from "sonamu";
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

class ImageAnalysisModelClass extends BaseModel {
  @upload({ mode: 'single' })
  @api({ httpMethod: 'POST' })
  async analyzeImage() {
    const { file } = Sonamu.getUploadContext();
    
    if (!file || !file.mimetype.startsWith('image/')) {
      throw new Error('์ด๋ฏธ์ง€ ํŒŒ์ผ์ด ํ•„์š”ํ•ฉ๋‹ˆ๋‹ค');
    }
    
    const buffer = await file.toBuffer();
    
    const result = await generateText({
      model: openai('gpt-4o'),
      messages: [
        {
          role: 'user',
          content: [
            { type: 'text', text: 'Analyze this image in detail' },
            { type: 'image', image: buffer },
          ],
        },
      ],
    });
    
    return {
      analysis: result.text,
      imageUrl: file.url,
    };
  }
}

Error Handling

import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

try {
  const result = await generateText({
    model: openai('gpt-4o'),
    prompt: '...',
  });
  
  return result.text;
} catch (error) {
  if (error.name === 'AI_APICallError') {
    // API call error
    console.error('API Error:', error.message);
    console.error('Status:', error.statusCode);
  } else if (error.name === 'AI_InvalidPromptError') {
    // invalid prompt
    console.error('Invalid Prompt:', error.message);
  } else {
    // any other error
    console.error('Unknown Error:', error);
  }
  
  throw error;
}

Cost Tracking

const result = await generateText({
  model: openai('gpt-4o'),
  prompt: '...',
});

// token usage
console.log('Prompt Tokens:', result.usage.promptTokens);
console.log('Completion Tokens:', result.usage.completionTokens);
console.log('Total Tokens:', result.usage.totalTokens);

// rough cost estimate -- in practice, input and output tokens are priced differently
const costPerToken = 0.00003;  // placeholder rate; check current OpenAI pricing
const cost = result.usage.totalTokens * costPerToken;
console.log('Cost:', cost);
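Since prompt and completion tokens are billed at different rates, a slightly more faithful estimate prices them separately. A sketch with placeholder rates (not real GPT-4o pricing; check the current price list):

```typescript
// Placeholder per-1K-token rates -- substitute the provider's actual prices.
const RATES = { promptPer1K: 0.0025, completionPer1K: 0.01 };

// Estimate the cost of a call from its token usage.
function estimateCost(usage: { promptTokens: number; completionTokens: number }): number {
  return (
    (usage.promptTokens / 1000) * RATES.promptPer1K +
    (usage.completionTokens / 1000) * RATES.completionPer1K
  );
}
```

Call it as `estimateCost(result.usage)` after `generateText` resolves.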

Practical Integration Example

AI Chat API

import { BaseModel, api, Context } from "sonamu";
import { openai } from '@ai-sdk/openai';
import { generateText } from 'ai';

class ChatModelClass extends BaseModel {
  @api({ httpMethod: 'POST' })
  async chat(
    message: string,
    conversationId: number | null,
    ctx: Context
  ) {
    // load the conversation history
    const history = conversationId
      ? await ConversationModel.findById(conversationId)
      : null;
    
    const messages = history?.messages || [];
    messages.push({
      role: 'user',
      content: message,
    });
    
    // AI ์‘๋‹ต ์ƒ์„ฑ
    const result = await generateText({
      model: openai('gpt-4o'),
      messages: [
        {
          role: 'system',
          content: '๋‹น์‹ ์€ ์นœ์ ˆํ•œ ๊ณ ๊ฐ ์ง€์› ์ฑ—๋ด‡์ž…๋‹ˆ๋‹ค.',
        },
        ...messages,
      ],
      temperature: 0.7,
      maxTokens: 500,
    });
    
    // store the response
    messages.push({
      role: 'assistant',
      content: result.text,
    });
    
    const conversation = await ConversationModel.saveOne({
      id: conversationId,
      user_id: ctx.user.id,
      messages,
      token_usage: result.usage.totalTokens,
    });
    
    return {
      conversationId: conversation.id,
      message: result.text,
      usage: result.usage,
    };
  }
}
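The `chat` handler above replays the entire stored history on every call, so long conversations will eventually exceed the context window. A crude mitigation is to cap how many messages are replayed; the helper below is a hypothetical sketch (a real implementation would count tokens rather than messages):

```typescript
type ChatMessage = { role: 'user' | 'assistant'; content: string };

// Keep only the most recent `max` messages, then drop any leading
// assistant turns so the window always starts with a user message.
function trimHistory(messages: ChatMessage[], max: number): ChatMessage[] {
  let window = messages.slice(-max);
  while (window.length > 0 && window[0].role === 'assistant') {
    window = window.slice(1);
  }
  return window;
}
```

In the handler, you would pass `trimHistory(messages, 20)` (or similar) to `generateText` while still persisting the full history in the database.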

์ฃผ์˜์‚ฌํ•ญ

Things to keep in mind when using the AI SDK:
  1. API key security: use environment variables
    // ❌ hard-coded key (via createOpenAI from '@ai-sdk/openai')
    const provider = createOpenAI({ apiKey: 'sk-...' });
    
    // ✅ environment variable
    const model = openai('gpt-4o');  // OPENAI_API_KEY is picked up automatically
    
  2. Error handling: always wrap calls in try-catch
    try {
      const result = await generateText({ ... });
    } catch (error) {
      console.error(error);
    }
    
  3. Token limits: set maxTokens
    generateText({
      model: openai('gpt-4o'),
      prompt: '...',
      maxTokens: 1000,  // controls cost
    });
    
  4. Stream cleanup: finish the stream even when an error occurs
    try {
      for await (const chunk of result.textStream) {
        // ...
      }
    } finally {
      // cleanup
    }
    
  5. Rtzr file size: large files should be chunked
    if (file.size > 10 * 1024 * 1024) {
      throw new Error('ํŒŒ์ผ ํฌ๊ธฐ๋Š” 10MB ์ดํ•˜์—ฌ์•ผ ํ•ฉ๋‹ˆ๋‹ค');
    }
    
  6. Image size: check GPT-4o's image limits
    // image size limit (20MB)
    if (imageBuffer.length > 20 * 1024 * 1024) {
      throw new Error('์ด๋ฏธ์ง€๊ฐ€ ๋„ˆ๋ฌด ํฝ๋‹ˆ๋‹ค');
    }
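Note 5 suggests chunking large audio files rather than simply rejecting them. A minimal byte-level split is sketched below; it is a naive illustration only, since real audio chunking should cut on silence or frame boundaries rather than arbitrary byte offsets:

```typescript
// Split a buffer into chunks of at most `chunkSize` bytes.
function chunkBuffer(buffer: Uint8Array, chunkSize: number): Uint8Array[] {
  const chunks: Uint8Array[] = [];
  for (let offset = 0; offset < buffer.length; offset += chunkSize) {
    // subarray creates a view, so no bytes are copied
    chunks.push(buffer.subarray(offset, offset + chunkSize));
  }
  return chunks;
}
```

Each chunk could then be transcribed separately and the segment timestamps offset by the chunk's position.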
    

Vercel AI SDK Documentation

For more features, see the official documentation.

๋‹ค์Œ ๋‹จ๊ณ„