Spring AI Setup and Configuration

Setting Up the Spring Boot Project with Spring AI


Spring AI connects a Spring Boot application to multiple AI model providers, including OpenAI, Anthropic (Claude), and Ollama (local models), allowing developers to leverage AI capabilities directly within their Spring applications.

Prerequisites

| Requirement | Version | Purpose |
|---|---|---|
| Java | 21 or higher | Required runtime for Spring Boot |
| IntelliJ IDEA | Community/Ultimate | IDE for development |
| Maven | Latest | Dependency management |
| Internet Connection | - | For downloading dependencies and accessing cloud AI APIs |

IntelliJ Community Edition doesn't support direct Spring project creation, so we'll use the Spring Initializr web interface.


Project Creation

Step 1: Initialize Project Using Spring Initializr

  1. Navigate to https://start.spring.io
  2. Configure project metadata:
| Setting | Value |
|---|---|
| Project | Maven |
| Language | Java |
| Spring Boot | Latest stable version |
| Java Version | 21 |
| Packaging | Jar |
| Group | com.example (or your preference) |
| Artifact | spring-ai-demo (or your preference) |

Step 2: Add Required Dependencies

Select the following dependencies from the available options:

1. Spring Web

  • Purpose: Provides RESTful web services capabilities
  • Usage: Creates controller endpoints for AI model interactions

2. OpenAI

  • Purpose: Integration with OpenAI models (GPT-3.5, GPT-4, etc.)
  • Type: Cloud-based API
  • Requires: API key

3. Ollama

  • Purpose: Integration with local open-source LLMs
  • Type: Local execution
  • Requires: Ollama installation on local machine

4. Anthropic Claude

  • Purpose: Integration with Anthropic's Claude models
  • Type: Cloud-based API
  • Requires: API key
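Once the project is generated, these starters show up in pom.xml roughly as below. This is a sketch: the artifact IDs follow the Spring AI 1.x naming scheme, and exact coordinates and versions depend on the Spring AI release you selected, so verify against the generated pom.xml.

```xml
<!-- Sketch of the generated dependencies; artifact IDs shown are the
     Spring AI 1.x names - check your generated pom.xml for the exact ones -->
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-starter-model-openai</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-starter-model-anthropic</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.ai</groupId>
        <artifactId>spring-ai-starter-model-ollama</artifactId>
    </dependency>
</dependencies>
```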

Step 3: Generate and Download Project

  1. Click "Generate" button
  2. Download the ZIP file
  3. Extract to your preferred workspace directory

Step 4: Open in IntelliJ IDEA

  1. Open IntelliJ IDEA
  2. Select "Open" or "Import Project"
  3. Navigate to the extracted project folder
  4. Select the project root directory (containing pom.xml)
  5. Click "OK" and wait for Maven to download dependencies

Configuration Setup

Location: application.properties

Navigate to src/main/resources/application.properties to configure API keys and model settings.

1. OpenAI Configuration

# OpenAI API Key (Required)
spring.ai.openai.api-key=<YOUR_OPENAI_API_KEY>

# Optional: specify the model (see the Spring AI docs for the current default)
spring.ai.openai.chat.options.model=gpt-4
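To avoid committing the key to version control, you can reference an environment variable instead, using Spring's standard property placeholder syntax (the variable name OPENAI_API_KEY here is just a convention):

```properties
# Resolves the key from the OPENAI_API_KEY environment variable at startup
spring.ai.openai.api-key=${OPENAI_API_KEY}
```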

2. Anthropic Configuration

# Anthropic API Key (Required)
spring.ai.anthropic.api-key=<YOUR_ANTHROPIC_API_KEY>

# Optional: specify the Claude model (see the Spring AI docs for the current default)
spring.ai.anthropic.chat.options.model=claude-3-opus

3. Ollama Configuration (Local Models)

# Ollama Model Configuration (No API key required)
spring.ai.ollama.chat.options.model=deepseek-r1:7b

# Alternative models you can configure:
# spring.ai.ollama.chat.options.model=mistral:latest
# spring.ai.ollama.chat.options.model=llama3.2:latest
# spring.ai.ollama.chat.options.model=gemma:7b
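By default, Spring AI expects Ollama at http://localhost:11434. If your Ollama instance runs on a different host or port, the base URL can be overridden (property name per Spring AI's Ollama documentation):

```properties
# Point Spring AI at a non-default Ollama host/port
spring.ai.ollama.base-url=http://localhost:11434
```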

Application Architecture

Controller Structure

The application uses separate controller files for each AI model provider, enabling modular and maintainable code:

src/main/java/com/telusko/SpringAIDemo
├── controller/
│   ├── OpenAIController.java      # Handles OpenAI requests
│   ├── AnthropicController.java   # Handles Anthropic requests
│   └── OllamaController.java      # Handles Ollama requests
└── SpringAiDemoApplication.java   # Main application class

URL Routing Architecture

Each controller exposes separate endpoints:

| Controller | Endpoint | Purpose |
|---|---|---|
| OpenAIController | /api/openai | Routes requests to OpenAI models |
| AnthropicController | /api/anthropic | Routes requests to Anthropic models |
| OllamaController | /api/ollama | Routes requests to local Ollama models |

Example Controller Structure

@RestController
@RequestMapping("/api")
public class OpenAIController {

    private final ChatClient chatClient;

    // Constructor injection: Spring AI auto-configures a ChatClient.Builder
    // for the default chat model
    public OpenAIController(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    @PostMapping("/openai")
    public String chat(@RequestBody String prompt) {
        // Send the prompt as a user message and return the text of the response
        return chatClient.prompt()
                .user(prompt)
                .call()
                .content();
    }
}
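The same pattern extends to the other providers. Below is a minimal sketch of an Ollama counterpart, assuming Spring AI 1.x, where each starter contributes a provider-specific ChatModel bean (OllamaChatModel here); with several providers on the classpath, building the ChatClient explicitly from that bean avoids ambiguity:

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("/api")
public class OllamaController {

    private final ChatClient chatClient;

    // Wrap the Ollama-specific ChatModel bean in a ChatClient
    public OllamaController(OllamaChatModel ollamaChatModel) {
        this.chatClient = ChatClient.create(ollamaChatModel);
    }

    @PostMapping("/ollama")
    public String chat(@RequestBody String prompt) {
        return chatClient.prompt().user(prompt).call().content();
    }
}
```

An AnthropicController follows the same shape with the corresponding AnthropicChatModel bean.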

Common Issues and Troubleshooting

Issue 1: "Failed to Fetch" Error in UI

Symptom: UI displays "failed to fetch" when submitting requests.

Causes and Solutions:

| Cause | Solution |
|---|---|
| Backend not running | Start the Spring Boot application |
| Port mismatch | Verify the UI points to the correct backend port |
| CORS configuration | Add a CORS configuration in Spring Boot |
| Missing API keys | Check application.properties for valid keys |
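If the UI is served from a different origin than the backend, the browser will block requests until CORS is allowed. A minimal sketch using Spring Web's WebMvcConfigurer (the http://localhost:3000 origin is a placeholder for wherever your UI actually runs):

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.web.servlet.config.annotation.CorsRegistry;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

@Configuration
public class CorsConfig implements WebMvcConfigurer {

    @Override
    public void addCorsMappings(CorsRegistry registry) {
        // Allow the UI origin to call the /api/** endpoints
        registry.addMapping("/api/**")
                .allowedOrigins("http://localhost:3000")
                .allowedMethods("GET", "POST");
    }
}
```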

Issue 2: OpenAI/Anthropic API Key Error

Symptom:

401 Unauthorized: Invalid API key

Solution:

  1. Verify API key is correct (no extra spaces)
  2. Check API key has not expired
  3. Ensure account has sufficient credits
  4. Verify key format matches provider requirements

Issue 3: Ollama Connection Error

Symptom:

Connection refused: http://localhost:11434

Solution:

  1. Start Ollama service: ollama serve
  2. Verify Ollama is running: ollama list
  3. Check model is downloaded: ollama pull deepseek-r1:7b
  4. Ensure no firewall blocking localhost:11434

Issue 4: Dependency Download Failures

Symptom: Maven fails to download Spring AI dependencies.

Solution:

  1. Check internet connection
  2. Update Maven settings.xml
  3. Clear Maven cache: mvn clean install -U
  4. Use latest Spring Boot version compatible with Spring AI

Model Configuration Options

Customizing Default Models

You can specify different models for each provider:

# OpenAI Models
spring.ai.openai.chat.options.model=gpt-3.5-turbo  # Faster, cheaper
spring.ai.openai.chat.options.model=gpt-4          # More capable
spring.ai.openai.chat.options.model=gpt-4-turbo    # Latest features

# Anthropic Models
spring.ai.anthropic.chat.options.model=claude-3-haiku   # Fastest
spring.ai.anthropic.chat.options.model=claude-3-sonnet  # Balanced
spring.ai.anthropic.chat.options.model=claude-3-opus    # Most capable

# Ollama Models (must be downloaded first)
spring.ai.ollama.chat.options.model=mistral:latest
spring.ai.ollama.chat.options.model=llama3.2:latest
spring.ai.ollama.chat.options.model=deepseek-r1:7b

Advanced Configuration Options

# Temperature (randomness) - OpenAI accepts 0.0 to 2.0; lower values are more deterministic
spring.ai.openai.chat.options.temperature=0.7

# Max tokens (response length)
spring.ai.openai.chat.options.max-tokens=1000

# Top P (nucleus sampling)
spring.ai.openai.chat.options.top-p=1.0


Summary

  • Spring Boot project is initialized using Spring Initializr (start.spring.io) with Java 21 and required dependencies such as Spring Web, OpenAI, Anthropic, and Ollama for AI integration.

  • API keys for cloud models (OpenAI, Anthropic) are configured in application.properties, while local models like Ollama require only model name configuration without any API key.

  • Spring AI dependencies in pom.xml enable integration with multiple AI providers, allowing a unified approach to interact with different LLMs.

  • Separate REST controllers are created for each AI model (OpenAI, Anthropic, Ollama) to handle requests and route prompts to the respective models.

  • Spring AI abstracts model interactions, enabling seamless communication with AI models through configuration and dependency injection without handling low-level API logic.

Written By: Muskan Garg
