# LLM Integration: Smart Context Requesting

This guide explains how to use inline configuration with the Context Generator: passing a JSON configuration directly
on the command line instead of keeping it in a separate configuration file. This approach creates an efficient workflow
between LLMs and your codebase for context gathering and problem solving, enabling on-the-fly context requests without
predefined configuration files.

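For example, a minimal inline request might look like the sketch below. It uses only configuration fields that appear
later in this guide (`documents`, `description`, `outputPath`, and a single `file` source); the module name and paths
are placeholders rather than values from a real project:

```bash
ctx --config='{
  "documents": [{
    "description": "Authentication Module",
    "outputPath": "auth.md",
    "sources": [{
      "type": "file",
      "sourcePaths": ["src/Auth"],
      "filePattern": "*.php"
    }]
  }]
}'
```
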
## Getting Started

Before diving into advanced usage, you need to understand:

1. **JSON Schema**: The Context Generator uses a specific JSON structure to define what context to gather

> Link to the [JSON Schema](https://raw.githubusercontent.com/context-hub/generator/refs/heads/main/json-schema.json)

2. **Project Structure**: Understanding your codebase organization helps in creating effective context requests (see the sketch after this list)

> **Note**: The [Tree source](/sources/tree-source) is designed to generate a project structure overview

3. **Request Instructions**: Learning how to formulate context requests properly

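As a sketch for point 2, the [Tree source](/sources/tree-source) can be combined with an inline request to hand the LLM
a quick overview of the project layout. The `"type": "tree"` source and its options are not covered in this guide, so
treat the fields below as assumptions and check the Tree source page for the authoritative schema:

```bash
ctx --config='{
  "documents": [{
    "description": "Project Structure Overview",
    "outputPath": "project-structure.md",
    "sources": [{
      "type": "tree",
      "sourcePaths": ["src"]
    }]
  }]
}'
```
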
## Understanding the LLM-Context Workflow

The inline configuration feature enables a powerful workflow where LLMs can request precisely the context they need
through executable commands. This creates a feedback loop that significantly enhances the quality of assistance:

> **Note**: Read more about inline configuration in
> the [Command Reference](/getting-started/command-reference#inline-configuration) section.

1. **User asks a question** about their codebase
2. **LLM generates a context request** as an executable command
3. **User runs the command** to gather relevant context
4. **LLM receives the context** and provides a more informed answer

## Real-World Example Workflow

Here's a complete workflow example showing how an LLM and the Context Generator can work together:

1. **User Question**:
   "I'm getting an error in our payment processing system when handling refunds. The error says 'Invalid transaction
   state'. How can I fix this?"

2. **LLM Initial Response**:
   "I'll help troubleshoot this issue. To understand what's happening, I need to see the payment processing code,
   especially the parts related to refunds and transaction state management. Could you run this command to gather the
   relevant context?"

```bash
ctx --config='{
  "documents": [{
    "description": "Payment Refund Processing",
    "outputPath": "payment-refund.md",
    "sources": [{
      "type": "file",
      "sourcePaths": ["src"],
      "filePattern": "*.php",
      "contains": "refund|transaction.*state|payment.*process"
    }]
  }]
}'
```

3. **User Executes Command** and shares the generated `.context/payment-refund.md` file.

4. **LLM Analysis**:
   "Based on the code you shared, I can see the issue. In the `RefundService.php` file, line 142, your code checks if
   the transaction is in the 'SETTLED' state, but it doesn't check for the 'PARTIALLY_SETTLED' state, which can also be
   valid for refunds..."

5. **LLM Solution**:
   "Here's how to fix the issue: modify the `validateRefundTransaction` method to also accept 'PARTIALLY_SETTLED' as a
   valid state for refunds..."

## Best Practices

### Optimizing Context Requests

1. **Be specific with your searches:**
    - Use precise file paths when you know the location
    - Combine `filePattern` with `contains` to narrow down results (see the sketch after this list)
    - Use regex patterns for more complex search needs

2. **Manage output size:**
    - Apply content modifiers to remove unnecessary details
    - Split large context needs into multiple focused requests
    - Use content filters to extract only relevant code sections

3. **Structure for readability:**
    - Use meaningful descriptions for each document
    - Group related files in the same document
    - Add text sources to provide explanatory notes

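Putting these practices together, a focused request might look like the following sketch. The paths, the search
patterns, and the `text` source with a `content` field are illustrative assumptions rather than values taken from this
guide:

```bash
ctx --config='{
  "documents": [{
    "description": "Refund Validation Logic",
    "outputPath": "refund-validation.md",
    "sources": [{
      "type": "file",
      "sourcePaths": ["src/Payment/Refund"],
      "filePattern": "*.php",
      "contains": "validate|state"
    }, {
      "type": "text",
      "content": "Focus on how transaction states are checked before a refund is issued."
    }]
  }]
}'
```
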
### Security Considerations

When working with sensitive codebases:

1. **Limit scope:** Only include necessary files and avoid exposing sensitive information
2. **Verify outputs:** Review generated context files before sharing with external LLMs (see the sketch below)
3. **Consider isolation:** For highly sensitive projects, consider running LLMs locally

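For instance, before sharing a generated document (such as the `.context/payment-refund.md` file from the example
above) with an external LLM, you can read through it and run a quick scan for credential-like strings; the patterns
below are only a starting point, not a complete check:

```bash
# Read through the generated context before sharing it.
less .context/payment-refund.md

# Quick scan for strings that look like credentials (extend the patterns for your project).
grep -niE 'password|secret|api[_-]?key|token' .context/payment-refund.md
```
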
## Conclusion

The combination of Context Generator's inline configuration and LLMs creates a powerful workflow that drastically
improves the efficiency of code assistance, troubleshooting, and development guidance. By enabling LLMs to request
precise context, you eliminate the need for lengthy explanations and manual code searching, leading to faster and more
accurate solutions.