A conversational AI assistant for the DevBcn conference that helps create social media content and newsletters and provides information about speakers and sessions.
The DevBcn Social Assistant is a Spring Boot application that leverages AI models (either OpenAI or Ollama) to provide information about the DevBcn conference, generate social media posts, and create newsletter content. It features a chat interface built with Vaadin where users can interact with the assistant.
The assistant can:
- Provide information about the conference (dates, venue, tracks, etc.)
- Generate social media posts for different platforms (Twitter, Instagram, LinkedIn, Bluesky)
- Create Mailchimp newsletter content
- Retrieve and present information about speakers and sessions
Built with:
- Java 21
- Spring Boot 3.5.3: Framework for building the application
- Spring AI 1.0.0: Integration with AI models
  - Support for OpenAI models (default)
  - Support for Ollama models (alternative)
- Vaadin 24.8.0: Web framework for building the user interface
- Docker Compose: For running Ollama locally
- TestContainers: For testing with containerized dependencies
Project layout:

src/main/java/com/devbcn/socialassistant/
- SocialAssistantApplication.java: Main application class
- config/: Configuration classes
- dto/: Data transfer objects for sessions, speakers, etc.
- service/: Service classes for business logic (see the sketch below)
  - ChatService.java: Handles interactions with AI models
  - SessionizeService.java: Provides tools for the AI to access conference data
  - SpeakerService.java: Manages speaker information
  - SessionService.java: Manages session information
- views/: Vaadin UI components
  - HomeView.java: Main chat interface

src/main/resources/
- application.properties: Application configuration
- system_message.md: Instructions for the AI model
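This layout is the core of how the assistant works: ChatService wraps the AI model, system_message.md supplies the assistant's instructions, and SessionizeService exposes conference data as tools the model can call. The following is a minimal, hypothetical sketch of that wiring with Spring AI 1.0.0; the class names mirror the layout above, but the method names, tool signature, and exact wiring are assumptions, not the repository's actual code.

```java
// Hypothetical sketch of the ChatService/SessionizeService wiring with Spring AI's
// ChatClient. Names mirror the project layout but are assumptions, not the real code.
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.tool.annotation.Tool;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.core.io.Resource;
import org.springframework.stereotype.Service;

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.List;

@Service
class ChatService {

    private final ChatClient chatClient;

    ChatService(ChatClient.Builder builder,
                SessionizeService sessionizeService,
                @Value("classpath:system_message.md") Resource systemMessage) throws IOException {
        this.chatClient = builder
                // system_message.md holds the assistant's instructions
                .defaultSystem(systemMessage.getContentAsString(StandardCharsets.UTF_8))
                // expose conference data lookups as tools the model can invoke
                .defaultTools(sessionizeService)
                .build();
    }

    String chat(String userMessage) {
        return chatClient.prompt()
                .user(userMessage)
                .call()
                .content();
    }
}

@Service
class SessionizeService {

    // The model calls this tool when it needs session data; the real service would
    // fetch it from the DevBcn Sessionize feed rather than return a placeholder.
    @Tool(description = "List the DevBcn sessions for a given track")
    List<String> sessionsByTrack(String track) {
        return List.of("Example session in the " + track + " track");
    }
}
```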
Prerequisites:
- Java 21 JDK
- Maven
- Docker (for Ollama profile)
Setup:
- Create a `.env.local` file in the project root with your OpenAI API key: `API_KEY=your_openai_api_key`
- Choose your AI provider profile:
  - OpenAI (default): Uses OpenAI models (requires API key)
  - Ollama: Uses local Ollama models (requires Docker)
```bash
# Build the project
mvn clean install

# Run with OpenAI profile (default)
mvn spring-boot:run

# Run with Ollama profile
mvn spring-boot:run -Pollama
```

To run against a local Ollama instance:

```bash
# Start Ollama container
docker-compose up -d

# Run the application with Ollama profile
mvn spring-boot:run -Pollama
```
- Access the application at http://localhost:8080
- Use the chat interface to interact with the assistant
- Ask questions about the DevBcn conference
- Request social media posts or newsletter content
- "Tell me about the DevBcn conference"
- "Create a Twitter post announcing speaker John Doe"
- "Generate a Mailchimp newsletter about early bird tickets"
- "What sessions are available in the Java track?"
Run the tests with:

```bash
mvn test
```
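TestContainers (listed in the stack above) lets tests spin up their dependencies in Docker. The following is a sketch of what an integration test against a containerized Ollama instance could look like; it is a hypothetical test class and assumes the Ollama starter, Spring AI's Testcontainers support, and the org.testcontainers:ollama module are on the test classpath.

```java
// Hypothetical integration test; not the repository's actual test suite.
import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.testcontainers.service.connection.ServiceConnection;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.ollama.OllamaContainer;

@SpringBootTest
@Testcontainers
class ChatServiceIT {

    // Starts a throwaway Ollama instance and wires its URL into Spring AI.
    // A real test would also need to pull the model the application expects.
    @Container
    @ServiceConnection
    static OllamaContainer ollama = new OllamaContainer("ollama/ollama:latest");

    @Autowired
    ChatService chatService;

    @Test
    void answersQuestionsAboutTheConference() {
        String reply = chatService.chat("When does DevBcn take place?");
        assertThat(reply).isNotBlank();
    }
}
```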
Available profiles:
- production: Optimizes the application for production
- openai: Uses OpenAI models (default)
- ollama: Uses Ollama models with Docker Compose integration