AI Architecture Diagram

Overview of AI Integration in KillIT v3

AI Data Flow

AI Use Case Flows

1. Semantic Search Flow

User Query → Embedding Generation → Vector Search → Result Ranking → AI Enhancement → Response
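A minimal TypeScript sketch of this pipeline is shown below. The `EmbeddingService`, `VectorStore`, and `ClaudeService` interfaces are illustrative stand-ins for the real KillIT v3 services, not their actual APIs.

```typescript
// Illustrative service interfaces, not the real KillIT v3 APIs.
interface EmbeddingService {
  embed(text: string): Promise<number[]>;
}

interface VectorStore {
  // Returns documents with a similarity score.
  search(queryVector: number[], limit: number): Promise<Array<{ id: string; text: string; score: number }>>;
}

interface ClaudeService {
  complete(prompt: string): Promise<string>;
}

// User Query -> Embedding Generation -> Vector Search -> Result Ranking -> AI Enhancement -> Response
async function semanticSearch(
  query: string,
  embeddings: EmbeddingService,
  store: VectorStore,
  claude: ClaudeService,
): Promise<{ results: Array<{ id: string; text: string; score: number }>; answer: string }> {
  const queryVector = await embeddings.embed(query);           // Embedding Generation
  const hits = await store.search(queryVector, 10);            // Vector Search
  const ranked = [...hits].sort((a, b) => b.score - a.score);  // Result Ranking

  // AI Enhancement: let Claude turn the top hits into a direct answer.
  const context = ranked.slice(0, 5).map((h, i) => `[${i + 1}] ${h.text}`).join('\n');
  const answer = await claude.complete(
    `Answer the question using only the context below.\n\nContext:\n${context}\n\nQuestion: ${query}`,
  );

  return { results: ranked, answer };
}
```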

2. Relationship Discovery Flow

Scan Data → Network Analysis → Pattern Recognition → Claude Analysis → Relationship Creation → Confidence Scoring
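The pattern-recognition and baseline confidence-scoring steps might look roughly like the following sketch. The `ScanConnection` and `CandidateRelationship` types are invented for illustration; the Claude Analysis step would refine the confidence before Relationship Creation.

```typescript
// Invented types standing in for the real scan and CMDB data model.
interface ScanConnection {
  sourceCi: string;  // CI that initiated the connection
  targetCi: string;  // CI that received it
  port: number;
}

interface CandidateRelationship {
  sourceCi: string;
  targetCi: string;
  observations: number;
  confidence: number;  // 0..1, to be refined by the Claude Analysis step
}

// Network Analysis + Pattern Recognition: group repeated connections into candidates
// and assign a baseline confidence from how often each pair was observed.
function discoverCandidates(connections: ScanConnection[], minObservations = 3): CandidateRelationship[] {
  const counts = new Map<string, CandidateRelationship>();
  for (const c of connections) {
    const key = `${c.sourceCi}->${c.targetCi}`;
    const entry = counts.get(key) ?? { sourceCi: c.sourceCi, targetCi: c.targetCi, observations: 0, confidence: 0 };
    entry.observations += 1;
    counts.set(key, entry);
  }
  return [...counts.values()]
    .filter((r) => r.observations >= minObservations)
    // Simple saturating score; Claude Analysis adjusts this before Relationship Creation.
    .map((r) => ({ ...r, confidence: Math.min(1, r.observations / 10) }));
}
```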

3. Insight Generation Flow

CI Selection → Data Aggregation → Context Building → Claude Processing → Insight Storage → UI Display
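A rough sketch of the aggregation, context-building, and storage steps is below. The `ClaudeService` interface plus the database, collection, and field names are assumptions, not the documented schema.

```typescript
import { MongoClient } from 'mongodb';

interface ClaudeService {
  complete(prompt: string): Promise<string>;
}

// CI Selection -> Data Aggregation -> Context Building -> Claude Processing -> Insight Storage -> UI Display
// Collection and field names below are assumptions for illustration.
async function generateInsight(ciId: string, claude: ClaudeService, mongo: MongoClient): Promise<string> {
  const db = mongo.db('killit');

  // Data Aggregation: pull the CI plus the relationships that reference it.
  const ci = await db.collection('configuration_items').findOne({ ciId });
  const related = await db.collection('relationships').find({ sourceCi: ciId }).limit(50).toArray();

  // Context Building: flatten the aggregated data into a single prompt.
  const prompt = [
    'You are analyzing a configuration item from a CMDB.',
    `CI: ${JSON.stringify(ci)}`,
    `Relationships: ${JSON.stringify(related)}`,
    'Summarize risks, anomalies, and recommended actions.',
  ].join('\n');

  const insight = await claude.complete(prompt);   // Claude Processing

  // Insight Storage: persisted so the UI can render it without re-invoking the model.
  await db.collection('ai_insights').insertOne({ ciId, insight, createdAt: new Date() });
  return insight;
}
```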

4. Software Classification Flow

Software Discovery → Name Normalization → AI Classification → Family Mapping → CPE Assignment → Hierarchy Building
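The normalization and family-mapping steps can be sketched as follows; the `SoftwareClassifier` interface stands in for the AI Classification and CPE Assignment steps and is not the real service API.

```typescript
interface SoftwareClassifier {
  // Hypothetical AI step: returns a product family and a CPE string for a normalized name.
  classify(normalizedName: string): Promise<{ family: string; cpe: string }>;
}

// Name Normalization: strip versions, editions, and architecture noise so that
// "Microsoft SQL Server 2019 (64-bit)" and "microsoft sql server 2022" normalize alike.
function normalizeSoftwareName(raw: string): string {
  return raw
    .toLowerCase()
    .replace(/\(.*?\)/g, ' ')                    // parenthesised qualifiers, e.g. "(64-bit)"
    .replace(/\b(x86|x64|amd64|arm64)\b/g, ' ')  // architecture tokens
    .replace(/\bv?\d+(\.\d+)*\b/g, ' ')          // version numbers
    .replace(/\s+/g, ' ')
    .trim();
}

// Software Discovery -> Name Normalization -> AI Classification -> Family Mapping -> CPE Assignment
async function classifyDiscoveredSoftware(
  rawNames: string[],
  classifier: SoftwareClassifier,
): Promise<Map<string, { family: string; cpe: string }>> {
  const families = new Map<string, { family: string; cpe: string }>();
  for (const raw of rawNames) {
    const normalized = normalizeSoftwareName(raw);
    // Family Mapping: identical normalized names reuse the earlier classification.
    if (!families.has(normalized)) {
      families.set(normalized, await classifier.classify(normalized));
    }
  }
  return families;
}
```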

Key AI Components

Core Services

  1. Claude Service: Central AI integration with AWS Bedrock (sketched after this list)
  2. Embedding Service: Vector generation for semantic search
  3. RAG Service: Retrieval-augmented generation for contextual responses
  4. AI Analytics: Usage tracking and optimization
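As a concrete illustration of the Claude Service item above, a minimal Claude call through AWS Bedrock could look like the sketch below. It uses `@aws-sdk/client-bedrock-runtime` with the Anthropic Messages body format; the model ID is a placeholder, and the real KillIT v3 wrapper will differ.

```typescript
import { BedrockRuntimeClient, InvokeModelCommand } from '@aws-sdk/client-bedrock-runtime';

// Placeholder model ID: substitute the Bedrock model ID configured for your account/region.
const DEFAULT_MODEL_ID = 'anthropic.claude-3-5-sonnet-20241022-v2:0';

const bedrock = new BedrockRuntimeClient({ region: process.env.AWS_REGION ?? 'us-east-1' });

// Minimal "Claude Service": send one user message, return the text of the reply.
async function invokeClaude(prompt: string, modelId: string = DEFAULT_MODEL_ID): Promise<string> {
  const command = new InvokeModelCommand({
    modelId,
    contentType: 'application/json',
    accept: 'application/json',
    body: JSON.stringify({
      anthropic_version: 'bedrock-2023-05-31',
      max_tokens: 1024,
      messages: [{ role: 'user', content: [{ type: 'text', text: prompt }] }],
    }),
  });

  const response = await bedrock.send(command);
  const payload = JSON.parse(new TextDecoder().decode(response.body));
  // The Messages API returns an array of content blocks; take the first text block.
  return payload.content?.[0]?.text ?? '';
}
```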

AI Models Used

  • Claude 4.5 Sonnet: Primary model for complex analysis
  • Claude 4.5 Haiku: Fast model for simple tasks
  • Claude 3.5 Sonnet: Fallback model
  • AWS Titan Embeddings: Vector generation
  • OpenAI GPT: Holiday generation (fallback)
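Model selection against this list (also the first cost-optimization lever described under Performance Optimizations) can be a simple lookup with a fallback, as in the sketch below. The Bedrock model IDs are placeholders; check the IDs actually available in your account and region.

```typescript
type TaskComplexity = 'simple' | 'complex';

// Placeholder Bedrock model IDs; verify against your account/region.
const MODEL_IDS = {
  primary: 'anthropic.claude-sonnet-4-5',   // Claude 4.5 Sonnet: complex analysis
  fast: 'anthropic.claude-haiku-4-5',       // Claude 4.5 Haiku: simple tasks
  fallback: 'anthropic.claude-3-5-sonnet',  // Claude 3.5 Sonnet: used when the primary fails
} as const;

function selectModel(complexity: TaskComplexity): string {
  return complexity === 'complex' ? MODEL_IDS.primary : MODEL_IDS.fast;
}

// Try the complexity-appropriate model first, fall back to Claude 3.5 Sonnet on failure.
async function invokeWithFallback(
  prompt: string,
  complexity: TaskComplexity,
  invoke: (prompt: string, modelId: string) => Promise<string>,
): Promise<string> {
  try {
    return await invoke(prompt, selectModel(complexity));
  } catch {
    return await invoke(prompt, MODEL_IDS.fallback);
  }
}
```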

Data Storage

  • MongoDB: Stores AI-generated content and embeddings
  • Redis: Caches embeddings and frequently used data
  • Vector Indexes: Enable semantic search capabilities
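Assuming the vector indexes are MongoDB Atlas Vector Search indexes (the page does not state the index type), creating and querying one from the Node.js driver looks roughly like the sketch below. The collection name, field names, index name, and 1024-dimension size (Titan Text Embeddings v2's default) are illustrative.

```typescript
import { MongoClient } from 'mongodb';

async function setupAndQueryVectorIndex(mongo: MongoClient, queryVector: number[]) {
  const collection = mongo.db('killit').collection('embeddings');   // names are assumptions

  // Create an Atlas Vector Search index over the stored embedding field.
  await collection.createSearchIndex({
    name: 'embedding_index',
    type: 'vectorSearch',
    definition: {
      fields: [
        { type: 'vector', path: 'embedding', numDimensions: 1024, similarity: 'cosine' },
      ],
    },
  });

  // Semantic search: $vectorSearch must be the first stage of the pipeline.
  return collection
    .aggregate([
      {
        $vectorSearch: {
          index: 'embedding_index',
          path: 'embedding',
          queryVector,
          numCandidates: 100,
          limit: 10,
        },
      },
      { $project: { text: 1, score: { $meta: 'vectorSearchScore' } } },
    ])
    .toArray();
}
```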

Performance Optimizations

  1. Caching Strategy (embedding cache sketched after this list)

    • Embedding cache (24-hour TTL)
    • Response cache for common queries
    • Batch processing for bulk operations
  2. Rate Limiting (token-bucket limiter sketched after this list)

    • 30 requests/minute per service
    • Token bucket algorithm
    • Automatic retry with backoff
  3. Cost Optimization

    • Model selection based on task complexity
    • Token usage monitoring
    • Batch processing where possible
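A minimal version of the 24-hour embedding cache from item 1, using node-redis and keying on a hash of the input text. The key scheme and the injected `embed` callback are assumptions.

```typescript
import { createHash } from 'node:crypto';
import { createClient } from 'redis';

// Call redis.connect() once at startup before the cache is used.
const redis = createClient({ url: process.env.REDIS_URL });
const EMBEDDING_TTL_SECONDS = 24 * 60 * 60;   // 24-hour TTL, as described above

// Return a cached embedding if present; otherwise compute it and cache it for 24 hours.
async function cachedEmbedding(
  text: string,
  embed: (text: string) => Promise<number[]>,   // e.g. a Titan Embeddings call
): Promise<number[]> {
  const key = `emb:${createHash('sha256').update(text).digest('hex')}`;

  const hit = await redis.get(key);
  if (hit) return JSON.parse(hit);

  const vector = await embed(text);
  await redis.set(key, JSON.stringify(vector), { EX: EMBEDDING_TTL_SECONDS });
  return vector;
}
```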
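And a dependency-free sketch of item 2: a token bucket at 30 requests/minute plus automatic retry with exponential backoff. The numbers mirror the limits stated above; this is a sketch, not the production limiter.

```typescript
// Token bucket: capacity 30, refilled at 30 tokens per minute (matching the stated limit).
class TokenBucket {
  private tokens: number;
  private lastRefill = Date.now();

  constructor(private capacity = 30, private refillPerMinute = 30) {
    this.tokens = capacity;
  }

  tryTake(): boolean {
    const now = Date.now();
    const refill = ((now - this.lastRefill) / 60_000) * this.refillPerMinute;
    this.tokens = Math.min(this.capacity, this.tokens + refill);
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Automatic retry with exponential backoff: wait for a token, then retry the call on failure.
async function withRateLimit<T>(bucket: TokenBucket, call: () => Promise<T>, maxRetries = 3): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    while (!bucket.tryTake()) await sleep(1_000);   // wait until a token is available
    try {
      return await call();
    } catch (err) {
      if (attempt >= maxRetries) throw err;
      await sleep(2 ** attempt * 1_000);            // 1s, 2s, 4s, ...
    }
  }
}
```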