
FoodFinder feature: AI-powered food identification for carb entry#1

Closed
taylorpatterson-T1D wants to merge 58 commits into main from feat/FoodFinder

Conversation


@taylorpatterson-T1D taylorpatterson-T1D commented Feb 8, 2026

This PR replaces the legacy FoodFinder PR #2329

The Problem We're Solving

Carb counting is the single hardest daily task for people managing diabetes with Loop. Every meal requires estimating carbohydrate content — and getting it wrong directly impacts Time in Range. Current workflow: the user mentally estimates carbs, types a number, and hopes for the best. There's no assistance, no database lookup, no learning from past meals.

What FoodFinder Does

FoodFinder adds AI-powered food identification directly into Loop's existing Add Carb Entry screen. It provides four ways to identify food and auto-populate carb values:

Text Search

Type a food name (e.g., "banana" or "chicken") into the search bar. FoodFinder queries the open USDA FoodData Central database and returns matching products with nutrition data. Select a result and carbs, fat, protein, fiber, and calories auto-populate. Serving size adjustments are built in. This path is best suited to simple, single-word foods like fruits and vegetables.
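The text-search path can be sketched as a FoodData Central request plus response decoding. This is an illustrative sketch, not Loop's actual code: the type names (`FDCSearchResponse`, `fdcSearchURL`, etc.) are invented for this example, though the endpoint and field names follow the public FDC search API.

```swift
import Foundation

// Minimal sketch of a FoodData Central text search. Type and function
// names are hypothetical; the JSON shape mirrors the public FDC API.
struct FDCNutrient: Decodable {
    let nutrientName: String
    let value: Double
    let unitName: String
}

struct FDCFood: Decodable {
    let description: String
    let foodNutrients: [FDCNutrient]

    // "Carbohydrate, by difference" is FDC's name for total carbs.
    var carbsGrams: Double? {
        foodNutrients.first { $0.nutrientName.hasPrefix("Carbohydrate") }?.value
    }
}

struct FDCSearchResponse: Decodable {
    let foods: [FDCFood]
}

// Build the search URL; the caller performs the request with URLSession
// and decodes the body into FDCSearchResponse.
func fdcSearchURL(query: String, apiKey: String) -> URL? {
    var components = URLComponents(string: "https://api.nal.usda.gov/fdc/v1/foods/search")
    components?.queryItems = [
        URLQueryItem(name: "api_key", value: apiKey),
        URLQueryItem(name: "query", value: query),
    ]
    return components?.url
}
```

Selecting a decoded `FDCFood` would then populate the carb entry fields from its nutrient values.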

Voice / Dictation Search

Speak or dictate a food description. FoodFinder detects dictation input automatically via the iOS keyboard's microphone button and routes it through AI generative search. Say "I am eating a turkey sandwich with cheese and a side of chips" and the AI analyzes the full meal description, estimates portions, and returns structured nutrition data for you to confirm.

AI Image Analysis

Tap the camera button, take a photo of your meal, and the AI analyzes visible portions using scale references (plate size, utensil dimensions, known object sizes). It returns a per-item nutrition breakdown with confidence scoring, USDA-referenced serving sizes, and a recommended absorption-time adjustment. Multi-item plates are supported: each detected food item is listed individually and can be excluded if the user plans to skip it.
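The per-item result described above can be modeled roughly like this. The struct and field names are assumptions for illustration, not the actual `FoodFinder_` types:

```swift
import Foundation

// Hypothetical model of one detected item from the AI image analysis.
struct DetectedFoodItem: Codable {
    let name: String
    let carbsGrams: Double
    let confidencePercent: Int   // displayed in the 20–97% range
    var isExcluded: Bool = false // user can skip an item from the meal
}

// Only included (non-excluded) items contribute to the carb entry total.
func totalCarbs(for items: [DetectedFoodItem]) -> Double {
    items.filter { !$0.isExcluded }.reduce(0) { $0 + $1.carbsGrams }
}
```

Excluding an item simply removes it from the sum before the carb field is pre-populated; the user still reviews the final number.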

Menu & Recipe Analysis (with Translation)

Point the camera at a restaurant menu, recipe card, or text listing food items in any language. FoodFinder runs on-device OCR first; if text is detected, it routes through a specialized text analysis path that transcribes, translates, and estimates nutrition using USDA standard serving sizes. It has been tested with menus in Spanish, Portuguese, Russian, German, and French. It has not yet been tested with CJK scripts (Hanzi, Kanji, or Hanja).
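The "OCR first" gate can be sketched as a line-count heuristic over the recognized text. The threshold of 5 significant lines mirrors the routing fix noted later in the commit history; the function names here are illustrative, not the actual implementation:

```swift
import Foundation

// Sketch of the OCR routing gate: after on-device text recognition
// returns candidate text, route to the menu/text analysis path only when
// enough "significant" lines are present. The commit history notes this
// threshold was raised from 1 to 5 to stop food photos with incidental
// text being misclassified as menus. Names are hypothetical.
func significantLines(fromOCR text: String) -> [String] {
    text.split(separator: "\n")
        .map { $0.trimmingCharacters(in: .whitespaces) }
        .filter { $0.count >= 3 }  // drop stray 1–2 character fragments
}

func shouldRouteToMenuPath(ocrText: String, threshold: Int = 5) -> Bool {
    significantLines(fromOCR: ocrText).count >= threshold
}
```

Photos that fail this gate fall through to the image-analysis path instead.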

User-Configurable Settings

All settings are in Loop Settings → FoodFinder:

  • FoodFinder Toggle: Master on/off for the entire feature
  • AI Provider: BYO API key; supports any image-capable OpenAI-compatible endpoint, the Anthropic Messages API, or Google Generative AI
  • API Key: Stored in the iOS Keychain (encrypted at rest, excluded from backups)
  • Model Selection: User picks their preferred model (e.g., gpt-4o, claude-sonnet-4-5-20250929, gemini-2.0-flash); be sure to choose a model that supports image input
  • Analysis History Retention: Last 24 hours, 7 days, 14 days, or 30 days
  • Advanced Dosing Recommendations: Optional AI-generated dosing context (disabled by default)
  • Advanced API Settings: Custom endpoint paths for self-hosted or Azure deployments
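The BYO-provider setting implies per-provider request routing, since each provider family uses a different chat path. A minimal sketch, assuming a hypothetical `AIProvider` enum (the paths match the public OpenAI, Anthropic, and Google APIs, but the routing code here is invented for illustration):

```swift
import Foundation

// Hypothetical provider routing. Each family exposes a different path;
// OpenAI-style servers take the model in the request body, while Google's
// Generative Language API encodes it in the URL.
enum AIProvider: String {
    case openAICompatible, anthropic, google
}

func chatEndpoint(provider: AIProvider, baseURL: URL, model: String) -> URL {
    switch provider {
    case .openAICompatible:
        return baseURL.appendingPathComponent("v1/chat/completions")
    case .anthropic:
        return baseURL.appendingPathComponent("v1/messages")
    case .google:
        // Colon in the last segment, so build from a string rather than
        // appendingPathComponent (which may percent-encode it).
        return URL(string: "\(baseURL.absoluteString)/v1beta/models/\(model):generateContent")!
    }
}
```

The "Advanced API Settings" option would override `baseURL` for self-hosted or Azure deployments.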

Safety Considerations

  • FoodFinder never boluses automatically. It only pre-populates the carb entry field — the user always reviews, edits, and confirms before any insulin action occurs.
  • Confidence scoring is displayed for every AI analysis (20–97% range). Low-confidence results are visually flagged.
  • The user can edit all values. AI-suggested carbs, absorption time, and serving count are starting points, not final answers.
  • API keys are stored in iOS Keychain — encrypted at rest, never in UserDefaults or plain text.
  • Feature toggle allows complete disable without removing code.
  • No data leaves the device except the API call to the user's own configured AI provider. No analytics, no telemetry, no third-party tracking.

Impact on Existing Loop Code

FoodFinder was designed for minimal integration footprint and easy containment within Loop:

  • 30 new files (all prefixed FoodFinder_, in dedicated subdirectories)
  • ~15,000 lines of new code across Models, Views, View Models, Services, Resources, and Tests
  • Only 7 existing files modified with a combined ~300 lines changed — mostly additive
  • Zero changes to LoopKit or any other submodules

Modified existing files:

  • CarbEntryView.swift: Embeds FoodFinder_EntryPoint (~5 lines) + analysis history picker
  • SettingsView.swift: Adds FoodFinder settings row + navigation link
  • CarbEntryViewModel.swift: Adds analysis history state + restored result bindings
  • FavoriteFoodDetailView.swift: Adds "Analyze with AI" button for saved favorites
  • FavoriteFoodsView.swift: Adds FoodFinder thumbnail support on favorite food rows
  • AddEditFavoriteFoodView.swift: Accepts pre-populated name/image from FoodFinder
  • AddEditFavoriteFoodViewModel.swift: Passes through thumbnail image binding

New file locations (all under Loop/):

  • Models/FoodFinder/ (3 files): Data models, analysis records, input result types
  • View Models/FoodFinder/ (1 file): Search state, AI analysis, product selection logic
  • Views/FoodFinder/ (8 files): All FoodFinder UI (entry point, camera, scanner, settings, search, etc.)
  • Services/FoodFinder/ (12 files): AI analysis, API clients, barcode scanning, voice, image storage, routing
  • Resources/FoodFinder/ (1 file): Feature flags and configuration
  • Documentation/FoodFinder/ (1 file): README
  • LoopTests/FoodFinder/ (3 files): Unit tests for barcode, OpenFoodFacts, and voice search

Screenshots

  1. FoodFinder search bar with text search results
IMG_1590
  2. Barcode scanner camera capture
IMG_1591
  3. AI camera view analyzing a plate of food
IMG_1592
  4. Product info card with nutrition circles and confidence badge
IMG_1593
  5. Menu/recipe analysis showing translated items
IMG_1594
  6. FoodFinder settings page with AI provider configuration
IMG_1595 IMG_1596
  7. Analysis history picker showing recent meals
IMG_1597
  8. Favorite food with FoodFinder thumbnail
IMG_1598

Video Demo

YouTube Demo: https://youtu.be/i8xToAYBe4M

Test Plan

For reviewers and field testers:

  • Toggle: Enable/disable FoodFinder in Settings and confirm the search bar appears/disappears in Add Carb Entry
  • Text search: Search "banana", select a result, verify carbs auto-populate and serving stepper works
  • Barcode scan: Scan a packaged food barcode, verify product lookup and nutrition display
  • AI image analysis: Photograph a meal, verify food items are detected with carbs and confidence score
  • Menu analysis: Photograph a restaurant menu or recipe, verify OCR detection routes to text analysis and items are translated if needed
  • Voice/dictation: Dictate a meal description, verify AI generates structured nutrition response
  • Analysis history: After several analyses, verify "Recent AI Analyses" picker appears below Continue button and re-selecting a past analysis restores its values
  • Favorites integration: Save an AI-analyzed food as a favorite, verify name and thumbnail carry through
  • Settings persistence: Configure an AI provider, close and reopen settings, verify API key and model persist
  • Long titles: Analyze an image that returns a very long food name — verify no horizontal overflow in the UI
  • Bolus flow: After FoodFinder populates carbs, tap Continue and verify the bolus calculator receives the correct carb value

Asking @marionbarker for review upon availability.

marionbarker and others added 30 commits August 24, 2025 17:01
Updated translations from Lokalise on Sun Aug 24 12:32:21 PDT 2025
Updated translations from Lokalise on Sat Aug 30 10:22:12 PDT 2025
Bolus view fixes, and updates for iOS26
add missing localization strings for Favorite Foods,
add comments to Favorite Foods string that were already localized,
remove some items that do not require localization
Clear bolus recommendation on initial edit
Support audio for pump managers that use silent audio for keep-alive
Updated translations from lokalise on Tue Sep 23 15:51:19 PDT 2025
marionbarker and others added 20 commits October 24, 2025 13:16
Updated translations from lokalise on Fri Oct 24 11:10:09 PDT 2025
Updated translations from lokalise on Wed Nov 19 09:07:32 PST 2025
Updated translations from lokalise on Sat Dec 27 14:50:21 PST 2025
Updated translations from lokalise on Sun Feb  1 09:46:29 PST 2026
* improve large font display;
* add section headings;
* use short labels for display, long labels for description
* Enable autoscaling in Live Activity widget to limit truncation
* bug fix for plot using glucose color; author: bastiaanv
FoodFinder adds barcode scanning, AI camera analysis, voice search, and
text-based food lookup to Loop's carb entry workflow. All feature code
lives in dedicated FoodFinder/ subdirectories with FoodFinder_ prefixed
filenames for clean isolation and portability to other Loop forks.

Integration touchpoints: ~29 lines across 3 existing files
(CarbEntryView, SettingsView, FavoriteFoodDetailView). Feature is
controlled by a single toggle in FoodFinder_FeatureFlags.swift.

New files: 34 (11 views, 3 models, 13 services, 2 view models,
1 feature flags, 1 documentation, 3 tests)
Voice search (microphone button) now uses the AI analysis pipeline
instead of USDA text search, enabling natural language food descriptions
like "a medium bowl of spicy ramen and a side of gyoza". Text-typed
searches continue using USDA/OpenFoodFacts as before.

Changes:
- SearchBar: Add mic button with voice search callback
- SearchRouter: Add analyzeFoodByDescription() routing through AI providers
- SearchViewModel: Add performVoiceSearch() async method
- EntryPoint: Wire VoiceSearchView sheet to AI analysis pipeline
Replace the separate mic button with automatic natural language detection.
When the user dictates into the search field via iOS keyboard dictation,
the text is analyzed: short queries (1-3 words like "apple") use USDA,
while longer descriptive phrases (4+ words like "a medium bowl of spicy
ramen and a side of gyoza") automatically route to the AI analysis path.

Changes:
- SearchBar: Remove mic button and onVoiceSearchTapped parameter
- SearchViewModel: Add isNaturalLanguageQuery() heuristic, route detected
  natural language through performVoiceSearch in performFoodSearch
- EntryPoint: Remove voice search sheet, wire onGenerativeSearchResult
  callback to handleAIFoodAnalysis
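The word-count heuristic this commit describes can be sketched in a few lines. The function name mirrors `isNaturalLanguageQuery` from the commit message, but the exact rules in the actual code may differ:

```swift
import Foundation

// Sketch of the dictation-routing heuristic: short queries (1–3 words)
// go to USDA text search; longer descriptive phrases (4+ words) route to
// the AI analysis path. Threshold is taken from the commit message.
func isNaturalLanguageQuery(_ query: String) -> Bool {
    let words = query.split(whereSeparator: { $0.isWhitespace })
    return words.count >= 4
}
```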
The Python script created group definitions but didn't properly attach
all of them to their parent groups. Fixes:
- Services group → now child of Loop app root (was orphaned)
- Resources group → now child of Loop app root (was orphaned)
- Documentation group → now child of project root (was orphaned)
- ViewModels/FoodFinder → moved from Loop root to View Models group
- Tests/FoodFinder → moved from project root to LoopTests group
…, analysis history

- Fix triple barcode fire by consuming scan result immediately in Combine sink
- Replace AsyncImage with pre-downloaded thumbnail to avoid SwiftUI rebuild issues
- Use smallest OFF thumbnail (100px) with static food icon fallback for slow servers
- Add secure Keychain storage for AI provider API keys
- Add analysis history tracking with FoodFinder_AnalysisRecord
- Consolidate AI provider settings and remove BYOTestConfig
- Remove barcode connectivity pre-check that added 3+ seconds latency per scan
- Add NSCache to ImageDownloader for thumbnail deduplication (50 items, 10MB)
- Remove artificial minimumSearchDuration delay from search and error paths
- Merge duplicate Combine observers into single combineLatest for AI recomputation
- Decode image_thumb_url from OpenFoodFacts API for smallest available thumbnail
- Wrap 369 bare print() calls in #if DEBUG across 8 FoodFinder files
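The NSCache-based thumbnail deduplication mentioned above might look roughly like this, with the 50-item / 10 MB limits taken from the commit message. The class name is illustrative, not the actual `ImageDownloader` code:

```swift
import Foundation

// Hypothetical sketch of the thumbnail cache: NSCache keyed by URL,
// bounded by both item count and total byte cost so repeated lookups of
// the same OpenFoodFacts thumbnail don't re-download or re-store it.
final class ThumbnailCache {
    private let cache = NSCache<NSURL, NSData>()

    init() {
        cache.countLimit = 50                    // at most 50 thumbnails
        cache.totalCostLimit = 10 * 1024 * 1024  // ~10 MB of image data
    }

    func data(for url: URL) -> Data? {
        cache.object(forKey: url as NSURL) as Data?
    }

    func store(_ data: Data, for url: URL) {
        // Cost = byte count, so totalCostLimit tracks real memory use.
        cache.setObject(data as NSData, forKey: url as NSURL, cost: data.count)
    }
}
```

NSCache also evicts automatically under memory pressure, which suits transient image data.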
…eaders

File consolidations (6 files removed, 2 new files created):

1. FoodFinder_ScanResult.swift + FoodFinder_VoiceResult.swift
   → FoodFinder_InputResults.swift

2. FoodFinder_FavoriteDetailView.swift + FoodFinder_FavoriteEditView.swift
   + FoodFinder_FavoritesView.swift → FoodFinder_FavoritesHelpers.swift

3. FoodFinder_AISettingsManager.swift
   → absorbed into FoodFinder_AIProviderConfig.swift

4. FoodFinder_FavoritesViewModel.swift
   → absorbed into FoodFinder_SearchViewModel.swift

Other changes:
- Fix long analysis titles overflowing the screen by programmatically
  truncating picker row names and constraining food type to 20 chars
- Improve AI prompts for menu/recipe/text image analysis
- Add text-only AI analysis path in AIServiceManager
- Increase AI token budget for multi-item responses
- Standardize all 26 FoodFinder file headers with consistent format
- Add originalAICarbs and aiConfidencePercent fields to
  FoodFinder_AnalysisRecord for tracking AI estimate accuracy
- Add Notification.Name.foodFinderMealLogged for real-time
  meal event observation
- Add MealDataProvider protocol with date-range query interface
  and AnalysisHistoryStore conformance
- Add "Last 30 days" retention option to Analysis History settings
@taylorpatterson-T1D taylorpatterson-T1D changed the title Add FoodFinder: AI-powered food identification for carb entry FoodFinder: AI-powered food identification for carb entry Feb 9, 2026
@taylorpatterson-T1D taylorpatterson-T1D changed the title FoodFinder: AI-powered food identification for carb entry Add FoodFinder feature: AI-powered food identification for carb entry Feb 9, 2026
@taylorpatterson-T1D taylorpatterson-T1D changed the title Add FoodFinder feature: AI-powered food identification for carb entry FoodFinder feature: AI-powered food identification for carb entry Feb 9, 2026
taylorpatterson-T1D and others added 3 commits February 9, 2026 16:58
- Absorption time model: conservative adjustments anchored to Loop's
  3-hour default. FPU adds +0/+0.5/+1.0 hr (was +1/+2.5/+4), fiber
  +0/+0.25/+0.5 (was +0/+1/+2), meal size +0/+0.25/+0.5 (was +0/+1/+2).
  Cap reduced from 8 to 5 hours. Updated AI prompt and 3 examples.
- OCR routing fix: raised menu detection threshold from 1 to 5 significant
  lines and always include image on menu path to prevent food photo
  misclassification (fixes "Unidentifiable Food Item" on food photos).
- Inline "Why X hrs?" pill on Absorption Time row replaces standalone
  DisclosureGroup row. Purple centered pill with fixed width, expands
  reasoning on tap. Uses AIAbsorptionTimePickerRow when AI-generated.
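The conservative absorption-time model in this commit reduces to simple tiered addition over Loop's 3-hour default. A sketch under the numbers stated above (tier selection itself, i.e. how fat, fiber, and meal size map to low/medium/high, is left to the AI and not shown; the `Tier` enum is invented for illustration):

```swift
import Foundation

// Sketch of the conservative absorption-time model: each factor adds one
// of three tiers of extra hours on top of the 3-hour default, and the
// result is capped at 5 hours. Tier values are from the commit message.
enum Tier: Int { case low = 0, medium, high }

func absorptionHours(fpu: Tier, fiber: Tier, mealSize: Tier) -> Double {
    let base = 3.0
    let fpuAdd   = [0.0, 0.5, 1.0][fpu.rawValue]      // fat-protein units
    let fiberAdd = [0.0, 0.25, 0.5][fiber.rawValue]
    let mealAdd  = [0.0, 0.25, 0.5][mealSize.rawValue]
    return min(base + fpuAdd + fiberAdd + mealAdd, 5.0)
}
```

Worst case is 3 + 1.0 + 0.5 + 0.5 = 5.0 hours, so the cap only binds at the extreme, which is what "conservative adjustments anchored to Loop's 3-hour default" means in practice.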

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@taylorpatterson-T1D taylorpatterson-T1D deleted the feat/FoodFinder branch February 11, 2026 17:55