Use internal data in your AI tools

    Connect tools like Claude & Cursor to your internal data, with context and governance built-in.

    Try Rig

    Rig in 70 seconds

You're starting to work inside Cowork, Cursor, Copilot, ChatGPT, Claude Code, or Windsurf, but how can you give it the right data?

    AI: SELECT * FROM user_revenue_2024 WHERE region = 'EMEA'

    Error: Table not found: user_revenue_2024

    Did you mean: rev_summary_v3, acct_data_final2?

    AI: SELECT * FROM rev_summary_v3...

    Tokens used: 18,432 / 32k. Still failing.
    Problem 1

    LLMs burn tokens and still fail

    Without schema context, models hallucinate table names, retry endlessly, and exhaust context windows before getting a useful answer.
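As a sketch of the fix: a governed layer can check a model's SQL against known schema context before anything runs. The table names below come from the example above; the helper itself is illustrative, not Rig's actual API.

```python
# Hypothetical sketch: validate table names against known schema context
# before a model's SQL ever reaches the warehouse.
import re

KNOWN_TABLES = {"rev_summary_v3", "acct_data_final2", "dim_customers"}

def referenced_tables(sql: str) -> set:
    """Crude extraction of table names after FROM/JOIN keywords."""
    return set(re.findall(r"(?:FROM|JOIN)\s+([a-zA-Z_][\w.]*)", sql, re.IGNORECASE))

def check_sql(sql: str) -> list:
    """Return hallucinated (unknown) table names; empty means the query is worth trying."""
    return sorted(referenced_tables(sql) - KNOWN_TABLES)

print(check_sql("SELECT * FROM user_revenue_2024 WHERE region = 'EMEA'"))
# → ['user_revenue_2024'], reported before any tokens are spent on retries
```

With the context layer in place, the unknown table is caught in one round trip instead of an exhausted context window.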

    SELECT * FROM transactions
    -- no WHERE clause

    Full table scan detected

    2.3B rows · est. $48.20

    user_id | amount | account
    u_9281 | $4,200 | ••••7823
    u_1047 | $892 | ••••3310
    u_5530 | $12,450 | ••••9901
    + 2,299,999,997 more rows...
    Problem 2

    Direct data access is risky

    Raw database access means expensive full-table scans, unfiltered sensitive data, and results with no business context attached.
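A minimal illustration of the kind of guard a governed access layer puts in front of the warehouse, assuming queries arrive as plain SQL strings (a real gateway would parse the query rather than pattern-match):

```python
# Minimal sketch of a pre-execution guard against unbounded scans.
# Substring matching is deliberately crude; it only illustrates the idea.
def is_bounded(sql: str) -> bool:
    """Reject queries with neither a WHERE filter nor a LIMIT clause."""
    lowered = sql.lower()
    return "where" in lowered or "limit" in lowered

assert not is_bounded("SELECT * FROM transactions")        # full scan: blocked
assert is_bounded("SELECT * FROM transactions LIMIT 100")  # bounded: allowed
```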

    Access Requests (4 pending)
    claude-agent-prod · pending · 3d
    cursor-workspace · pending · 1d
    analytics-bot · denied · 5d
    gpt-integration · pending · 7d
    jake@acme.co · approved · 12d
    Manual review required for each request
    Problem 3

    Provisioning is painful

Setting up data access for every AI agent and internal user individually is slow and repetitive.

    Contextualised, safe data access with 3 clicks

    Rig generates and maintains context for your data, and lets data teams provision access for humans and agents.

    Rig Connect

    Use cases

    Works for every team that touches data

    10x faster dbt modelling

    Λ Rig Context Layer: upstream lineage · usage history · table grain · join strategy · semantic metrics
    claude-code, ~/project
    Build a mart model for monthly active users by plan tier
    Claude: I'll check the warehouse context to understand the source tables and grain...
    RIG: get_context(schema=analytics)
    RIG: get_table_columns(events, subscriptions)
    RIG: test_sql(draft model query)
    RIG: execute_sql(validate 1,247 rows)
    DBT: compile mart_monthly_active_users
    Claude: Model written to models/marts/mart_monthly_active_users.sql with a schema test and docs. Validated against 3 months of data.
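On the wire, each RIG step above is an MCP tool call, i.e. a JSON-RPC 2.0 `tools/call` request. The tool name `get_context` and its `schema` argument come from the transcript; the request framing follows the MCP spec, and everything else here is a sketch:

```python
# Sketch of one step from the transcript above as a raw MCP request.
# The endpoint and transport are omitted; only the payload shape is shown.
import json

request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_context",            # tool name from the demo
        "arguments": {"schema": "analytics"},
    },
}
payload = json.dumps(request)
```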

    How Rig works

    From warehouse to AI tool in minutes

    01

    Connects to your data sources

    Rig reads and understands the data in your structured data sources. Data never leaves its source.

    Try Rig
    Data Warehouse
    CRM
    ERP
    ΛRig

    02

    Generates your data context layer

    Business definitions, join paths, usage rules, and metric logic, all auto-generated and editable. Context auto-updates as your schema evolves.

    Auto-generated context:
    Joins: fct_orders (17 cols) · dim_customers (14 cols) · fct_fraud_alerts (10 cols)
    Semantic Metric: Fraud Rate
    Usage Rule: When filtering…
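The context entry above could be represented roughly like this; the field names and join keys are assumptions for illustration, not Rig's actual schema:

```python
# Illustrative shape of one auto-generated context entry, using the tables
# and metric shown above. All field names and join keys are hypothetical.
fraud_context = {
    "joins": [
        {"left": "fct_orders", "right": "dim_customers", "on": "customer_id"},
        {"left": "fct_orders", "right": "fct_fraud_alerts", "on": "order_id"},
    ],
    "semantic_metrics": {
        "fraud_rate": "count of fraud alerts / count of orders, per period",
    },
    "usage_rules": [
        "When filtering…",  # editable by the data team (see step 03)
    ],
}
```

Because the layer is structured data rather than loose .md files, drift detection and edits via Claude/Cursor operate on the same object.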

    03

    Edit and add to the context layer

    Data teams can add tribal knowledge and business context that tooling can't infer, like edge cases, exceptions, nuanced definitions, and rules only insiders know.

    claude-code, ~/project
    can you add a rule that 'revenue' yearly always starts Jan 1st.
    RIG: create_usage_rule(fct_transactions)
    Claude: Done. Added a usage rule to fct_transactions.

    04

    Build inside Rig or connect your AI tools

Automate data-heavy processes in Rig, or expose a governed MCP endpoint to Claude, Cursor, Copilot, or any other agent in minutes.

    MCP guide
    Raw JSON · SQL Queries · Data Flow

    Trigger (cron) → Agentic Researcher (24 rows) → Reporter (1,018 words)
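For the "connect your AI tools" path, most MCP clients (Claude Desktop and Cursor among them) read a JSON config with an `mcpServers` map. A hedged sketch with a placeholder server name and endpoint URL:

```python
# Sketch of a client-side MCP config entry. "rig" and the URL are
# placeholders; consult your client's MCP docs for the exact file location.
import json

config = {
    "mcpServers": {
        "rig": {
            "url": "https://rig.example.com/mcp",  # hypothetical governed endpoint
        }
    }
}
print(json.dumps(config, indent=2))
```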

    Why Rig

    Context and governance that other approaches miss

    Direct database access gives AI raw data with no understanding. Rig gives it context.

    Capability | Without Rig | With Rig
    Connecting data | Manual permissioning | One-click OAuth
    Data context | Manual & slow (or AI slop) | Accurate and maintained
    Context freshness | Goes stale as schema changes | Detects drift & updates
    Tribal knowledge | Write & share .md files | Edit the context database via Claude/Cursor
    Token use | Big scans & greps burn tokens | Context via MCP, token-efficient
    Governance and audit | Build & maintain manually | Ready out of the box
    Integrations | Manually build & maintain | Built-in automations for CLIs, IDEs, and SaaS tools

    Ready to use your internal data in your AI tools?