Snowflake Capabilities

Naftiko 0.5 capability definitions for Snowflake - 100 capabilities showing integration workflows and service orchestrations.

Audits Snowflake role grants for a list of users, identifies excessive privileges, and creates a ServiceNow access review task for the security team.

naftiko: "0.5"
info:
  label: "Access Review and ServiceNow Ticket Generator"
  description: "Audits Snowflake role grants for a list of users, identifies excessive privileges, and creates a ServiceNow access review task for the security team."
  tags:
    - security
    - compliance
    - snowflake
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: access-review
      port: 8080
      tools:
        - name: run-access-review
          description: "Audit Snowflake role assignments and create a ServiceNow task for any users with excessive privileges."
          inputParameters:
            - name: role_name
              in: body
              type: string
              description: "The Snowflake role to audit (e.g., ACCOUNTADMIN, SYSADMIN)."
            - name: assignment_group
              in: body
              type: string
              description: "ServiceNow assignment group for the review task."
          steps:
            - name: get-role-grants
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SHOW GRANTS OF ROLE {{role_name}}"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: create-review-task
              type: call
              call: "servicenow.create-task"
              with:
                short_description: "Access Review: {{role_name}} role in Snowflake"
                description: "The role {{role_name}} is currently granted to {{get-role-grants.data.length}} principals. Please review each assignment and revoke any that are no longer justified."
                assignment_group: "{{assignment_group}}"
                category: "security_review"
                priority: "2"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://{{snow_instance}}.service-now.com/api/now"
      authentication:
        type: basic
        username: "$secrets.servicenow_user"
        password: "$secrets.servicenow_password"
      resources:
        - name: tasks
          path: "/table/sc_task"
          operations:
            - name: create-task
              method: POST
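Each `snowflake.submit-statement` call above maps to a POST against the SQL API `/statements` resource. A minimal sketch of the request body and headers that call implies, assuming key-pair JWT bearer auth as declared in the capability (the helper names are illustrative, not part of the definition):

```python
def build_submit_statement(statement, warehouse, database, schema):
    """JSON body for POST /api/v2/statements (Snowflake SQL API)."""
    return {
        "statement": statement,
        "warehouse": warehouse,
        "database": database,
        "schema": schema,
    }

def build_headers(jwt):
    # KEYPAIR_JWT matches the bearer-token authentication declared above.
    return {
        "Authorization": f"Bearer {jwt}",
        "Content-Type": "application/json",
        "X-Snowflake-Authorization-Token-Type": "KEYPAIR_JWT",
    }

# The get-role-grants step, expressed as a concrete request body.
body = build_submit_statement(
    "SHOW GRANTS OF ROLE SYSADMIN", "ADMIN_WH", "SNOWFLAKE", "ACCOUNT_USAGE"
)
```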

Retrieves alert condition status data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Alert Condition Status"
  description: "Retrieves alert condition status data from the Snowflake cloud data platform."
  tags:
    - alert
    - snowflake
    - status
capability:
  exposes:
    - type: mcp
      namespace: alert
      port: 8080
      tools:
        - name: alert-condition-status
          description: "Retrieves alert condition status data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.alert-condition-status"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/alert/condition/status/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: alert-condition-status
              method: GET
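Resource paths in these definitions embed `{{...}}` placeholders for path parameters such as `input_id`. A minimal sketch of how a runtime might resolve them (the function name is illustrative, not part of Naftiko):

```python
def resolve_path(template, **params):
    # Substitute {{name}} placeholders in a resource path template.
    for key, value in params.items():
        template = template.replace("{{" + key + "}}", str(value))
    return template

# The alert-condition-status operation resolves its path like so:
path = resolve_path("/alert/condition/status/{{input_id}}", input_id="42")
```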

Retrieves API integration configuration data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "API Integration Config"
  description: "Retrieves API integration configuration data from the Snowflake cloud data platform."
  tags:
    - api
    - snowflake
    - config
capability:
  exposes:
    - type: mcp
      namespace: api
      port: 8080
      tools:
        - name: api-integration-config
          description: "Retrieves API integration configuration data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.api-integration-config"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/api/integration/config/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: api-integration-config
              method: GET

Discovers untagged columns in a Snowflake schema using Cortex data classification, applies recommended sensitivity tags, and reports results to a Slack governance channel.

naftiko: "0.5"
info:
  label: "Automated Snowflake Tag Propagation"
  description: "Discovers untagged columns in a Snowflake schema using Cortex data classification, applies recommended sensitivity tags, and reports results to a Slack governance channel."
  tags:
    - data-governance
    - classification
    - cortex
    - snowflake
    - slack
capability:
  exposes:
    - type: mcp
      namespace: auto-tagging
      port: 8080
      tools:
        - name: auto-classify-columns
          description: "Run Cortex classification on untagged columns and apply sensitivity tags."
          inputParameters:
            - name: database
              in: body
              type: string
              description: "Snowflake database to scan."
            - name: schema
              in: body
              type: string
              description: "Schema to scan for untagged columns."
            - name: slack_channel
              in: body
              type: string
              description: "Slack channel for classification reports."
          steps:
            - name: run-classification
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT EXTRACT_SEMANTIC_CATEGORIES('{{database}}.{{schema}}')"
                warehouse: "ML_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: apply-tags
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CALL ASSOCIATE_SEMANTIC_CATEGORY_TAGS('{{database}}.{{schema}}', {{run-classification.data[0][0]}})"
                warehouse: "ML_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: report-results
              type: call
              call: "slack.post-message"
              with:
                channel: "{{slack_channel}}"
                text: "Auto-Classification Complete: Schema {{database}}.{{schema}} scanned. Sensitivity tags applied via Cortex classification. Review tag assignments in Snowflake."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: messages
          path: "/chat.postMessage"
          operations:
            - name: post-message
              method: POST
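Note that Snowflake documents `EXTRACT_SEMANTIC_CATEGORIES` and `ASSOCIATE_SEMANTIC_CATEGORY_TAGS` as operating on a table, so a schema-wide scan would normally loop over tables. A sketch of the statement pair for one table (the per-table input is a hypothetical addition to this capability's parameters):

```python
def classification_statements(database, schema, table):
    # Build the Cortex classification statement pair for a single table.
    fq = f"{database}.{schema}.{table}"
    extract = f"SELECT EXTRACT_SEMANTIC_CATEGORIES('{fq}')"
    apply_tags = (
        f"CALL ASSOCIATE_SEMANTIC_CATEGORY_TAGS('{fq}', "
        f"EXTRACT_SEMANTIC_CATEGORIES('{fq}'))"
    )
    return extract, apply_tags
```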

When an employee is terminated in Workday, disables their Snowflake user account, revokes all role grants, and creates a ServiceNow closure ticket.

naftiko: "0.5"
info:
  label: "Automated User Deprovisioning"
  description: "When an employee is terminated in Workday, disables their Snowflake user account, revokes all role grants, and creates a ServiceNow closure ticket."
  tags:
    - security
    - governance
    - snowflake
    - workday
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: user-deprovisioning
      port: 8080
      tools:
        - name: deprovision-user
          description: "Disable a Snowflake user, revoke roles, and log a ServiceNow ticket upon Workday termination."
          inputParameters:
            - name: employee_id
              in: body
              type: string
              description: "Workday employee ID of the terminated user."
            - name: snowflake_username
              in: body
              type: string
              description: "Snowflake username to disable."
          steps:
            - name: get-employee
              type: call
              call: "workday.get-worker"
              with:
                worker_id: "{{employee_id}}"
            - name: disable-user
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "ALTER USER {{snowflake_username}} SET DISABLED = TRUE"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: revoke-roles
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "REVOKE ALL PRIVILEGES ON ALL SCHEMAS IN DATABASE ANALYTICS FROM USER {{snowflake_username}}"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: create-closure-ticket
              type: call
              call: "servicenow.create-incident"
              with:
                short_description: "Snowflake access revoked: {{get-employee.full_name}}"
                description: "User {{snowflake_username}} ({{get-employee.full_name}}) has been disabled in Snowflake due to termination. All roles revoked."
                category: "security"
                assigned_group: "IAM_Team"
  consumes:
    - type: http
      namespace: workday
      baseUri: "https://wd2-impl-services1.workday.com/ccx/api/v1"
      authentication:
        type: bearer
        token: "$secrets.workday_token"
      resources:
        - name: workers
          path: "/workers/{{worker_id}}"
          inputParameters:
            - name: worker_id
              in: path
          operations:
            - name: get-worker
              method: GET
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://{{snow_instance}}.service-now.com/api/now"
      authentication:
        type: basic
        username: "$secrets.servicenow_user"
        password: "$secrets.servicenow_password"
      resources:
        - name: incidents
          path: "/table/incident"
          operations:
            - name: create-incident
              method: POST
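The DDL in the steps above splices `snowflake_username` straight into statements, so a runtime would want to validate it as a plain identifier before interpolation; note also that Snowflake attaches privileges to roles rather than to users, so the revoke step would typically target the user's granted roles. A hypothetical guard for the disable step:

```python
import re

# Plain (unquoted) Snowflake identifier: letter or underscore, then
# letters, digits, underscores, or dollar signs.
IDENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_$]*$")

def disable_user_statement(username):
    # Reject anything that could smuggle extra SQL into the template.
    if not IDENT.match(username):
        raise ValueError(f"unsafe identifier: {username!r}")
    return f"ALTER USER {username} SET DISABLED = TRUE"
```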

After Snowflake transformations complete, triggers a Tableau extract refresh and notifies the analytics team in Microsoft Teams that fresh data is available.

naftiko: "0.5"
info:
  label: "BI Dashboard Refresh Trigger"
  description: "After Snowflake transformations complete, triggers a Tableau extract refresh and notifies the analytics team in Microsoft Teams that fresh data is available."
  tags:
    - analytics
    - bi
    - snowflake
    - tableau
    - microsoft-teams
capability:
  exposes:
    - type: mcp
      namespace: bi-refresh
      port: 8080
      tools:
        - name: refresh-tableau-extract
          description: "Trigger a Tableau extract refresh after Snowflake data updates and notify the team."
          inputParameters:
            - name: datasource_id
              in: body
              type: string
              description: "Tableau datasource ID to refresh."
            - name: site_id
              in: body
              type: string
              description: "Tableau site ID."
            - name: teams_webhook
              in: body
              type: string
              description: "Microsoft Teams webhook URL for notifications."
          steps:
            - name: trigger-refresh
              type: call
              call: "tableau.refresh-datasource"
              with:
                site_id: "{{site_id}}"
                datasource_id: "{{datasource_id}}"
            - name: notify-teams
              type: call
              call: "msteams.post-webhook"
              with:
                webhook_url: "{{teams_webhook}}"
                text: "Tableau datasource {{datasource_id}} refresh triggered. Job ID: {{trigger-refresh.job.id}}. Fresh Snowflake data will be available shortly."
  consumes:
    - type: http
      namespace: tableau
      baseUri: "https://{{tableau_server}}/api/3.21/sites/{{site_id}}"
      authentication:
        type: bearer
        token: "$secrets.tableau_token"
      inputParameters:
        - name: site_id
          in: path
      resources:
        - name: datasources
          path: "/datasources/{{datasource_id}}/refresh"
          inputParameters:
            - name: datasource_id
              in: path
          operations:
            - name: refresh-datasource
              method: POST
    - type: http
      namespace: msteams
      baseUri: "https://outlook.office.com/webhook"
      authentication:
        type: none
      resources:
        - name: webhook
          path: "/{{webhook_url}}"
          inputParameters:
            - name: webhook_url
              in: path
          operations:
            - name: post-webhook
              method: POST

Orchestrates a Cortex AI function deployment pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Cortex AI Function Deployment Pipeline"
  description: "Orchestrates a Cortex AI function deployment pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - cortex
    - snowflake
    - slack
    - jira
    - datadog
capability:
  exposes:
    - type: mcp
      namespace: cortex
      port: 8080
      tools:
        - name: cortex-ai-function-deployment-pipeline
          description: "Orchestrates a Cortex AI function deployment pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "slack.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "jira.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "datadog.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/cortex"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/cortex"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/cortex"
          operations:
            - name: execute-3
              method: POST

Uses Snowflake Cortex SUMMARIZE function to generate text summaries from a document table and stores results in a summary output table.

naftiko: "0.5"
info:
  label: "Cortex LLM Text Summarizer"
  description: "Uses Snowflake Cortex SUMMARIZE function to generate text summaries from a document table and stores results in a summary output table."
  tags:
    - machine-learning
    - cortex
    - nlp
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: cortex-summarize
      port: 8080
      tools:
        - name: summarize-documents
          description: "Run Snowflake Cortex SUMMARIZE on a text column and store summaries."
          inputParameters:
            - name: source_table
              in: body
              type: string
              description: "Fully qualified source table with documents."
            - name: text_column
              in: body
              type: string
              description: "Column containing text to summarize."
            - name: output_table
              in: body
              type: string
              description: "Fully qualified output table for summaries."
            - name: id_column
              in: body
              type: string
              description: "Primary key column for the source table."
          steps:
            - name: run-summarize
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "INSERT INTO {{output_table}} ({{id_column}}, SUMMARY, GENERATED_AT) SELECT {{id_column}}, SNOWFLAKE.CORTEX.SUMMARIZE({{text_column}}), CURRENT_TIMESTAMP() FROM {{source_table}} WHERE {{id_column}} NOT IN (SELECT {{id_column}} FROM {{output_table}})"
                warehouse: "ML_WH"
                database: "ANALYTICS"
                schema: "PUBLIC"
            - name: count-summaries
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT COUNT(*) AS NEW_SUMMARIES FROM {{output_table}} WHERE GENERATED_AT >= DATEADD(minutes, -5, CURRENT_TIMESTAMP())"
                warehouse: "ML_WH"
                database: "ANALYTICS"
                schema: "PUBLIC"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
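The `NOT IN` anti-join in `run-summarize` is what makes reruns incremental: only source rows with no entry in the output table are summarized again. A sketch of the statement builder (the helper name is illustrative):

```python
def summarize_insert(source_table, text_column, output_table, id_column):
    # Insert Cortex summaries only for rows not already summarized.
    return (
        f"INSERT INTO {output_table} ({id_column}, SUMMARY, GENERATED_AT) "
        f"SELECT {id_column}, SNOWFLAKE.CORTEX.SUMMARIZE({text_column}), "
        f"CURRENT_TIMESTAMP() FROM {source_table} "
        f"WHERE {id_column} NOT IN (SELECT {id_column} FROM {output_table})"
    )
```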

Invokes a Snowflake Cortex ML forecast function on a time-series table and returns predicted values. Used by analytics teams for demand or revenue forecasting.

naftiko: "0.5"
info:
  label: "Cortex ML Forecast Runner"
  description: "Invokes a Snowflake Cortex ML forecast function on a time-series table and returns predicted values. Used by analytics teams for demand or revenue forecasting."
  tags:
    - machine-learning
    - cortex
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-cortex
      port: 8080
      tools:
        - name: run-forecast
          description: "Execute a Snowflake Cortex ML forecast on a time-series dataset and return predictions."
          inputParameters:
            - name: model_name
              in: body
              type: string
              description: "The name of the trained forecast model."
            - name: forecast_horizon
              in: body
              type: integer
              description: "Number of periods to forecast ahead."
          call: "snowflake.submit-statement"
          with:
            statement: "CALL {{model_name}}!FORECAST(FORECASTING_PERIODS => {{forecast_horizon}})"
            warehouse: "ML_WH"
            database: "ANALYTICS"
            schema: "ML_MODELS"
          outputParameters:
            - name: predictions
              type: array
              mapping: "$.data"
            - name: query_id
              type: string
              mapping: "$.statementHandle"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
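The `predictions` output maps `$.data`, which the SQL API returns as an array of rows whose values arrive as strings. A sketch of turning those rows into floats; the default column position of the forecast value is an assumption, so it is a parameter:

```python
def extract_predictions(response_json, value_index=1):
    # Pull the forecast column out of the SQL API "data" rows and
    # coerce the string-encoded numbers to floats.
    rows = response_json.get("data", [])
    return [float(row[value_index]) for row in rows]
```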

Queries a Snowflake Cortex Search service to find relevant documents or records using semantic search, and returns ranked results with relevance scores.

naftiko: "0.5"
info:
  label: "Cortex Search Service Query"
  description: "Queries a Snowflake Cortex Search service to find relevant documents or records using semantic search, and returns ranked results with relevance scores."
  tags:
    - machine-learning
    - cortex
    - search
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: cortex-search
      port: 8080
      tools:
        - name: semantic-search
          description: "Run a semantic search query against a Snowflake Cortex Search service."
          inputParameters:
            - name: search_service
              in: body
              type: string
              description: "Fully qualified Cortex Search service name."
            - name: query_text
              in: body
              type: string
              description: "Natural language query for semantic search."
            - name: max_results
              in: body
              type: integer
              description: "Maximum number of results to return."
          steps:
            - name: run-search
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT * FROM TABLE({{search_service}}!SEARCH(QUERY => '{{query_text}}', LIMIT => {{max_results}}))"
                warehouse: "ML_WH"
                database: "ANALYTICS"
                schema: "SEARCH"
            - name: log-query
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "INSERT INTO ANALYTICS.SEARCH.QUERY_LOG (QUERY_TEXT, SERVICE_NAME, RESULT_COUNT, QUERIED_AT) VALUES ('{{query_text}}', '{{search_service}}', {{run-search.data.length}}, CURRENT_TIMESTAMP())"
                warehouse: "ML_WH"
                database: "ANALYTICS"
                schema: "SEARCH"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST

Runs Snowflake Cortex SENTIMENT function on a text column and returns sentiment scores. Used by customer analytics teams to gauge feedback tone.

naftiko: "0.5"
info:
  label: "Cortex Sentiment Analyzer"
  description: "Runs Snowflake Cortex SENTIMENT function on a text column and returns sentiment scores. Used by customer analytics teams to gauge feedback tone."
  tags:
    - machine-learning
    - cortex
    - nlp
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-cortex-nlp
      port: 8080
      tools:
        - name: analyze-sentiment
          description: "Run Cortex SENTIMENT analysis on a text column in a Snowflake table."
          inputParameters:
            - name: source_table
              in: body
              type: string
              description: "Fully qualified table name containing text data."
            - name: text_column
              in: body
              type: string
              description: "The column name containing text to analyze."
            - name: limit
              in: body
              type: integer
              description: "Maximum number of rows to analyze."
          call: "snowflake.submit-statement"
          with:
            statement: "SELECT {{text_column}}, SNOWFLAKE.CORTEX.SENTIMENT({{text_column}}) AS sentiment_score FROM {{source_table}} LIMIT {{limit}}"
            warehouse: "ML_WH"
            database: "ANALYTICS"
            schema: "PUBLIC"
          outputParameters:
            - name: results
              type: array
              mapping: "$.data"
            - name: query_id
              type: string
              mapping: "$.statementHandle"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST

Queries Snowflake account usage views to report warehouse credit consumption for a given date range. Used by FinOps teams to monitor cloud spend.

naftiko: "0.5"
info:
  label: "Credit Usage Reporter"
  description: "Queries Snowflake account usage views to report warehouse credit consumption for a given date range. Used by FinOps teams to monitor cloud spend."
  tags:
    - platform
    - cost-management
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-finops
      port: 8080
      tools:
        - name: get-credit-usage
          description: "Retrieve warehouse credit consumption from Snowflake account usage for a specified date range."
          inputParameters:
            - name: start_date
              in: body
              type: string
              description: "Start date for the credit usage query in YYYY-MM-DD format."
            - name: end_date
              in: body
              type: string
              description: "End date for the credit usage query in YYYY-MM-DD format."
          call: "snowflake.submit-statement"
          with:
            statement: "SELECT WAREHOUSE_NAME, SUM(CREDITS_USED) AS TOTAL_CREDITS FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY WHERE START_TIME >= '{{start_date}}' AND START_TIME < '{{end_date}}' GROUP BY WAREHOUSE_NAME ORDER BY TOTAL_CREDITS DESC"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: credit_data
              type: array
              mapping: "$.data"
            - name: query_id
              type: string
              mapping: "$.statementHandle"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
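The `credit_data` rows come back as `[WAREHOUSE_NAME, TOTAL_CREDITS]` pairs with the numerics encoded as strings, so a FinOps consumer typically coerces them before aggregating. An illustrative sketch:

```python
def credits_by_warehouse(rows):
    # rows follow the SQL API "data" shape for the metering query:
    # [[warehouse_name, total_credits], ...] with numbers as strings.
    return {name: float(credits) for name, credits in rows}

def grand_total(rows):
    # Total credits across all warehouses in the date range.
    return sum(credits_by_warehouse(rows).values())
```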

Creates a new outbound data share in Snowflake, grants select on specified objects, adds a consumer account, and notifies the data product owner via Slack.

naftiko: "0.5"
info:
  label: "Cross-Account Data Share Provisioner"
  description: "Creates a new outbound data share in Snowflake, grants select on specified objects, adds a consumer account, and notifies the data product owner via Slack."
  tags:
    - data-sharing
    - governance
    - snowflake
    - slack
capability:
  exposes:
    - type: mcp
      namespace: data-sharing
      port: 8080
      tools:
        - name: provision-data-share
          description: "Create an outbound Snowflake data share, add objects, register a consumer, and notify via Slack."
          inputParameters:
            - name: share_name
              in: body
              type: string
              description: "Name for the new data share."
            - name: database
              in: body
              type: string
              description: "Database containing the objects to share."
            - name: schema
              in: body
              type: string
              description: "Schema containing the objects to share."
            - name: table_name
              in: body
              type: string
              description: "Table to add to the share."
            - name: consumer_account
              in: body
              type: string
              description: "Snowflake account identifier of the consumer."
            - name: slack_channel
              in: body
              type: string
              description: "Slack channel for share notifications."
          steps:
            - name: create-share
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CREATE SHARE IF NOT EXISTS {{share_name}}"
                warehouse: "ADMIN_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: grant-usage
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "GRANT USAGE ON DATABASE {{database}} TO SHARE {{share_name}}; GRANT USAGE ON SCHEMA {{database}}.{{schema}} TO SHARE {{share_name}}; GRANT SELECT ON TABLE {{database}}.{{schema}}.{{table_name}} TO SHARE {{share_name}}"
                warehouse: "ADMIN_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: add-consumer
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "ALTER SHARE {{share_name}} ADD ACCOUNTS = {{consumer_account}}"
                warehouse: "ADMIN_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: notify-owner
              type: call
              call: "slack.post-message"
              with:
                channel: "{{slack_channel}}"
                text: "Data Share '{{share_name}}' provisioned. Table {{database}}.{{schema}}.{{table_name}} shared with account {{consumer_account}}."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: messages
          path: "/chat.postMessage"
          operations:
            - name: post-message
              method: POST
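
Each `snowflake.submit-statement` call above resolves, per the consumes block, to a POST against the Snowflake SQL API v2 `/statements` endpoint with a bearer JWT. A rough stdlib-only sketch of the request the create-share step would issue — the object names in the sample payload are illustrative:

```python
import json
import urllib.request

def build_statement_payload(statement, warehouse, database, schema):
    # Body shape expected by POST /api/v2/statements.
    return {
        "statement": statement,
        "warehouse": warehouse,
        "database": database,
        "schema": schema,
    }

def submit_statement(account_identifier, jwt, payload):
    # Bearer auth against the SQL API v2, matching the consumes block.
    req = urllib.request.Request(
        f"https://{account_identifier}.snowflakecomputing.com/api/v2/statements",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {jwt}",
            # Required when the bearer token is a key-pair JWT:
            "X-Snowflake-Authorization-Token-Type": "KEYPAIR_JWT",
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)  # not invoked in this sketch

# Payload the create-share step would send (illustrative names):
payload = build_statement_payload(
    "CREATE SHARE IF NOT EXISTS SALES_SHARE", "ADMIN_WH", "SALES_DB", "PUBLIC"
)
```

Note the grant-usage step sends three statements separated by semicolons; multi-statement requests to this endpoint additionally require a `parameters.MULTI_STATEMENT_COUNT` field in the body.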

Orchestrates a cross-cloud replication pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Cross-Cloud Replication Pipeline"
  description: "Orchestrates a cross-cloud replication pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - cross
    - snowflake
    - datadog
    - github
    - confluence
capability:
  exposes:
    - type: mcp
      namespace: cross
      port: 8080
      tools:
        - name: cross-cloud-replication-pipeline
          description: "Orchestrates a cross-cloud replication pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "datadog.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "github.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "confluence.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/cross"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/cross"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/cross"
          operations:
            - name: execute-3
              method: POST

Orchestrates a data catalog metadata pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Catalog Metadata Pipeline"
  description: "Orchestrates a data catalog metadata pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - jira
    - datadog
    - github
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-catalog-metadata-pipeline
          description: "Orchestrates a data catalog metadata pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "jira.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "datadog.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "github.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Extracts table and column metadata from Snowflake and registers it in an Alation data catalog entry, ensuring discoverability for analysts.

naftiko: "0.5"
info:
  label: "Data Catalog Registration"
  description: "Extracts table and column metadata from Snowflake and registers it in an Alation data catalog entry, ensuring discoverability for analysts."
  tags:
    - data-governance
    - catalog
    - snowflake
    - alation
capability:
  exposes:
    - type: mcp
      namespace: catalog-registration
      port: 8080
      tools:
        - name: register-in-catalog
          description: "Extract Snowflake table metadata and register in Alation data catalog."
          inputParameters:
            - name: database
              in: body
              type: string
              description: "Snowflake database containing the table."
            - name: schema
              in: body
              type: string
              description: "Schema containing the table."
            - name: table_name
              in: body
              type: string
              description: "Table to register in the catalog."
            - name: ds_id
              in: body
              type: integer
              description: "Alation datasource ID for the Snowflake connection."
          steps:
            - name: get-metadata
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT COLUMN_NAME, DATA_TYPE, IS_NULLABLE, COMMENT FROM {{database}}.INFORMATION_SCHEMA.COLUMNS WHERE TABLE_SCHEMA = '{{schema}}' AND TABLE_NAME = '{{table_name}}' ORDER BY ORDINAL_POSITION"
                warehouse: "COMPUTE_WH"
                database: "{{database}}"
                schema: "INFORMATION_SCHEMA"
            - name: get-table-comment
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT COMMENT FROM {{database}}.INFORMATION_SCHEMA.TABLES WHERE TABLE_SCHEMA = '{{schema}}' AND TABLE_NAME = '{{table_name}}'"
                warehouse: "COMPUTE_WH"
                database: "{{database}}"
                schema: "INFORMATION_SCHEMA"
            - name: register-table
              type: call
              call: "alation.update-table"
              with:
                ds_id: "{{ds_id}}"
                schema_name: "{{schema}}"
                table_name: "{{table_name}}"
                description: "{{get-table-comment.data[0][0]}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: alation
      baseUri: "https://{{alation_host}}/integration/v2"
      authentication:
        type: bearer
        token: "$secrets.alation_token"
      resources:
        - name: tables
          path: "/table"
          operations:
            - name: update-table
              method: POST
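
The get-metadata step above receives its results through the SQL API's rowset, where each row is an array of string values in ORDINAL_POSITION order. A minimal sketch of reshaping those rows into column records before catalog registration — the output field names here are illustrative, not Alation's actual payload schema:

```python
def rows_to_columns(rowset):
    """Map SQL API rows from the get-metadata step
    (COLUMN_NAME, DATA_TYPE, IS_NULLABLE, COMMENT) into
    catalog-ready column records."""
    return [
        {
            "name": name,
            "type": data_type,
            "nullable": is_nullable == "YES",
            "description": comment or "",
        }
        for name, data_type, is_nullable, comment in rowset
    ]

# Example rowset as returned by the INFORMATION_SCHEMA.COLUMNS query:
cols = rows_to_columns([
    ["ORDER_ID", "NUMBER", "NO", "Primary key"],
    ["NOTES", "TEXT", "YES", None],
])
```

Null comments come back as `null` in the rowset, so the sketch normalizes them to empty strings before they reach the catalog.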

Orchestrates a data clean room setup pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Clean Room Setup Pipeline"
  description: "Orchestrates a data clean room setup pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - salesforce
    - slack
    - jira
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-clean-room-setup-pipeline
          description: "Orchestrates a data clean room setup pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "salesforce.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "slack.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "jira.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates a data collaboration workspace pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Collaboration Workspace Pipeline"
  description: "Orchestrates a data collaboration workspace pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - terraform
    - pagerduty
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-collaboration-workspace-pipeline
          description: "Orchestrates a data collaboration workspace pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "terraform.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "pagerduty.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "servicenow.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates a data engineering CI/CD pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Engineering CI/CD Pipeline"
  description: "Orchestrates a data engineering CI/CD pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - datadog
    - github
    - confluence
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-engineering-cicd-pipeline
          description: "Orchestrates a data engineering CI/CD pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "datadog.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "github.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "confluence.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Checks the last update timestamp of critical Snowflake tables against defined SLA thresholds. If stale, creates a PagerDuty incident and logs the SLA breach.

naftiko: "0.5"
info:
  label: "Data Freshness SLA Monitor"
  description: "Checks the last update timestamp of critical Snowflake tables against defined SLA thresholds. If stale, creates a PagerDuty incident and logs the SLA breach."
  tags:
    - data-quality
    - sla
    - snowflake
    - pagerduty
capability:
  exposes:
    - type: mcp
      namespace: sla-monitor
      port: 8080
      tools:
        - name: check-data-freshness
          description: "Verify that a Snowflake table has been updated within its SLA window and alert if stale."
          inputParameters:
            - name: table_name
              in: body
              type: string
              description: "Fully qualified table name to check."
            - name: timestamp_column
              in: body
              type: string
              description: "Column containing the last-updated timestamp."
            - name: max_age_hours
              in: body
              type: integer
              description: "Maximum acceptable age in hours before SLA breach."
            - name: escalation_policy_id
              in: body
              type: string
              description: "PagerDuty escalation policy for SLA breaches."
          steps:
            - name: check-freshness
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT MAX({{timestamp_column}}) AS LAST_UPDATED, DATEDIFF(hour, MAX({{timestamp_column}}), CURRENT_TIMESTAMP()) AS AGE_HOURS FROM {{table_name}}"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: create-incident
              type: call
              call: "pagerduty.create-incident"
              with:
                title: "SLA Breach: {{table_name}} data is {{check-freshness.data[0][1]}} hours old (limit: {{max_age_hours}}h)"
                urgency: "high"
                escalation_policy_id: "{{escalation_policy_id}}"
                body: "Table {{table_name}} last updated at {{check-freshness.data[0][0]}}. Current age: {{check-freshness.data[0][1]}} hours. SLA threshold: {{max_age_hours}} hours."
            - name: log-breach
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "INSERT INTO OBSERVABILITY.SLA.BREACH_LOG (TABLE_NAME, LAST_UPDATED, AGE_HOURS, THRESHOLD_HOURS, DETECTED_AT) VALUES ('{{table_name}}', '{{check-freshness.data[0][0]}}', {{check-freshness.data[0][1]}}, {{max_age_hours}}, CURRENT_TIMESTAMP())"
                warehouse: "COMPUTE_WH"
                database: "OBSERVABILITY"
                schema: "SLA"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: incidents
          path: "/incidents"
          operations:
            - name: create-incident
              method: POST
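
As written, the create-incident step runs on every invocation; the staleness decision the tool's description implies (the AGE_HOURS result from DATEDIFF compared against max_age_hours) can be sketched in plain Python. The function name here is illustrative:

```python
from datetime import datetime, timedelta, timezone

def is_stale(last_updated, max_age_hours, now=None):
    """True when the table's last update is older than the SLA window,
    mirroring the AGE_HOURS > max_age_hours comparison in the query."""
    now = now or datetime.now(timezone.utc)
    return now - last_updated > timedelta(hours=max_age_hours)

# With a 24-hour SLA, a table last updated 30 hours ago is in breach:
now = datetime(2024, 1, 2, 12, 0, tzinfo=timezone.utc)
breach = is_stale(now - timedelta(hours=30), 24, now=now)
ok = is_stale(now - timedelta(hours=2), 24, now=now)
```

A runtime evaluating this capability would be expected to gate the create-incident and log-breach steps on this comparison rather than paging unconditionally.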

Orchestrates a data governance framework pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Governance Framework Pipeline"
  description: "Orchestrates a data governance framework pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - github
    - confluence
    - terraform
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-governance-framework-pipeline
          description: "Orchestrates a data governance framework pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "github.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "confluence.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "terraform.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates a data lakehouse optimization pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Lakehouse Optimization Pipeline"
  description: "Orchestrates a data lakehouse optimization pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - github
    - confluence
    - terraform
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-lakehouse-optimization-pipeline
          description: "Orchestrates a data lakehouse optimization pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "github.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "confluence.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "terraform.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates a data lineage impact pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Lineage Impact Pipeline"
  description: "Orchestrates a data lineage impact pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - jira
    - datadog
    - github
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-lineage-impact-pipeline
          description: "Orchestrates a data lineage impact pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "jira.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "datadog.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "github.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates a data marketplace listing pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Marketplace Listing Pipeline"
  description: "Orchestrates a data marketplace listing pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - slack
    - jira
    - datadog
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-marketplace-listing-pipeline
          description: "Orchestrates a data marketplace listing pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "slack.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "jira.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "datadog.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Applies a dynamic data masking policy to sensitive columns in a Snowflake table, verifies the policy is active, and logs the governance action to ServiceNow.

naftiko: "0.5"
info:
  label: "Data Masking Policy Applier"
  description: "Applies a dynamic data masking policy to sensitive columns in a Snowflake table, verifies the policy is active, and logs the governance action to ServiceNow."
  tags:
    - data-governance
    - security
    - snowflake
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: data-masking
      port: 8080
      tools:
        - name: apply-masking-policy
          description: "Apply a masking policy to a Snowflake column and log the action in ServiceNow."
          inputParameters:
            - name: table_name
              in: body
              type: string
              description: "Fully qualified table name."
            - name: column_name
              in: body
              type: string
              description: "Column to apply masking to."
            - name: policy_name
              in: body
              type: string
              description: "Masking policy name to apply."
            - name: assignment_group
              in: body
              type: string
              description: "ServiceNow group for governance logging."
          steps:
            - name: apply-policy
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "ALTER TABLE {{table_name}} MODIFY COLUMN {{column_name}} SET MASKING POLICY {{policy_name}}"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: verify-policy
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT * FROM TABLE(INFORMATION_SCHEMA.POLICY_REFERENCES(REF_ENTITY_NAME => '{{table_name}}', REF_ENTITY_DOMAIN => 'TABLE'))"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "INFORMATION_SCHEMA"
            - name: log-governance-action
              type: call
              call: "servicenow.create-task"
              with:
                short_description: "Masking policy applied: {{policy_name}} on {{table_name}}.{{column_name}}"
                description: "Dynamic masking policy '{{policy_name}}' applied to column {{column_name}} in table {{table_name}}. Verified active. Applied via automated governance workflow."
                assignment_group: "{{assignment_group}}"
                category: "data_governance"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://{{snow_instance}}.service-now.com/api/now"
      authentication:
        type: basic
        username: "$secrets.servicenow_user"
        password: "$secrets.servicenow_password"
      resources:
        - name: tasks
          path: "/table/sc_task"
          operations:
            - name: create-task
              method: POST
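
The apply-policy step interpolates `{{table_name}}`, `{{column_name}}`, and `{{policy_name}}` directly into the ALTER statement, so a runtime should validate those inputs before submission. A minimal sketch of that statement construction with a whitelist check on each identifier — the helper names are illustrative, and a real implementation might instead double-quote and escape identifiers:

```python
import re

# Plain (unquoted) Snowflake identifier: letter or underscore,
# then letters, digits, underscores, or dollar signs.
_IDENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_$]*$")

def check_ident(name):
    """Reject anything that is not a plain unquoted identifier."""
    if not _IDENT.match(name):
        raise ValueError(f"unsafe identifier: {name!r}")
    return name

def masking_policy_statement(table_fqn, column, policy):
    """Build the ALTER TABLE ... SET MASKING POLICY statement
    used by the apply-policy step, validating each name part."""
    parts = [check_ident(p) for p in table_fqn.split(".")]
    return (
        f"ALTER TABLE {'.'.join(parts)} MODIFY COLUMN "
        f"{check_ident(column)} SET MASKING POLICY {check_ident(policy)}"
    )

# Illustrative object names:
stmt = masking_policy_statement("PII_DB.CUSTOMERS.ACCOUNTS", "EMAIL", "MASK_EMAIL")
```

Anything containing whitespace, semicolons, or quotes fails the check, which blocks statement injection through the templated parameters.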

Orchestrates data mesh domain provisioning pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Mesh Domain Provisioning Pipeline"
  description: "Orchestrates data mesh domain provisioning pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - salesforce
    - slack
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-mesh-domain-provisioning-pipeline
          description: "Orchestrates data mesh domain provisioning pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "snowflake.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "salesforce.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "slack.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates data mesh self service pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Mesh Self Service Pipeline"
  description: "Orchestrates data mesh self service pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - jira
    - datadog
    - github
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-mesh-self-service-pipeline
          description: "Orchestrates data mesh self service pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "jira.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "datadog.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "github.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates data observability pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Observability Pipeline"
  description: "Orchestrates data observability pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - pagerduty
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-observability-pipeline
          description: "Orchestrates data observability pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "pagerduty.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "servicenow.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "snowflake.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates data pipeline dependency pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Pipeline Dependency Pipeline"
  description: "Orchestrates data pipeline dependency pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - servicenow
    - salesforce
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-pipeline-dependency-pipeline
          description: "Orchestrates data pipeline dependency pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "servicenow.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "snowflake.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "salesforce.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

When a Snowflake task fails, retrieves the error details from task history, creates a PagerDuty incident, and posts an alert to a Slack channel so the on-call data engineer can respond.

naftiko: "0.5"
info:
  label: "Data Pipeline Failure Alert Orchestrator"
  description: "When a Snowflake task fails, retrieves the error details from task history, creates a PagerDuty incident, and posts an alert to a Slack channel so the on-call data engineer can respond."
  tags:
    - data-engineering
    - alerting
    - snowflake
    - pagerduty
    - slack
capability:
  exposes:
    - type: mcp
      namespace: pipeline-alerting
      port: 8080
      tools:
        - name: alert-task-failure
          description: "Given a failed Snowflake task name, pull the error details, create a PagerDuty incident, and notify Slack."
          inputParameters:
            - name: task_name
              in: body
              type: string
              description: "Fully qualified Snowflake task name that failed."
            - name: escalation_policy_id
              in: body
              type: string
              description: "PagerDuty escalation policy ID for the on-call team."
            - name: slack_channel
              in: body
              type: string
              description: "Slack channel ID for pipeline alerts."
          steps:
            - name: get-task-error
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT * FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(TASK_NAME => '{{task_name}}')) WHERE STATE = 'FAILED' ORDER BY SCHEDULED_TIME DESC LIMIT 1"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: create-incident
              type: call
              call: "pagerduty.create-incident"
              with:
                title: "Snowflake Task Failure: {{task_name}}"
                urgency: "high"
                escalation_policy_id: "{{escalation_policy_id}}"
                body: "Task {{task_name}} failed at {{get-task-error.data[0][2]}}. Error: {{get-task-error.data[0][5]}}"
            - name: notify-slack
              type: call
              call: "slack.post-message"
              with:
                channel: "{{slack_channel}}"
                text: "Pipeline Alert: Snowflake task `{{task_name}}` failed. PagerDuty incident {{create-incident.id}} created. Error: {{get-task-error.data[0][5]}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: incidents
          path: "/incidents"
          operations:
            - name: create-incident
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: messages
          path: "/chat.postMessage"
          operations:
            - name: post-message
              method: POST
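
Note that the incident body above reads task-history columns by position (`data[0][2]`, `data[0][5]`), which breaks silently if the column order of `TASK_HISTORY` changes. A more robust approach pairs each row with the column names the SQL API returns in `resultSetMetaData.rowType`. A minimal sketch, using an illustrative (abridged) response shape rather than the real full column set:

```python
def rows_as_dicts(sql_api_response):
    """Pair each data row with the column names reported in resultSetMetaData."""
    names = [col["name"] for col in sql_api_response["resultSetMetaData"]["rowType"]]
    return [dict(zip(names, row)) for row in sql_api_response["data"]]

# Illustrative response shape for the TASK_HISTORY query (column set abridged).
response = {
    "resultSetMetaData": {"rowType": [
        {"name": "NAME"}, {"name": "STATE"},
        {"name": "SCHEDULED_TIME"}, {"name": "ERROR_MESSAGE"},
    ]},
    "data": [["MY_DB.PUBLIC.LOAD_ORDERS", "FAILED",
              "2024-05-01 02:00:00", "Numeric value 'N/A' is not recognized"]],
}

failed = rows_as_dicts(response)[0]
alert = (f"Task {failed['NAME']} failed at {failed['SCHEDULED_TIME']}. "
         f"Error: {failed['ERROR_MESSAGE']}")
```

Referencing `ERROR_MESSAGE` by name keeps the PagerDuty and Slack templates stable across result-schema changes.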

Orchestrates data privacy classification pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Privacy Classification Pipeline"
  description: "Orchestrates data privacy classification pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - confluence
    - terraform
    - pagerduty
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-privacy-classification-pipeline
          description: "Orchestrates data privacy classification pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "confluence.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "terraform.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "pagerduty.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates data product marketplace pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Product Marketplace Pipeline"
  description: "Orchestrates data product marketplace pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - jira
    - datadog
    - github
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-product-marketplace-pipeline
          description: "Orchestrates data product marketplace pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "jira.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "datadog.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "github.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Runs a data quality SQL check against a Snowflake table. If anomalies are detected, creates a Jira issue for the data engineering team and notifies them in Microsoft Teams.

naftiko: "0.5"
info:
  label: "Data Quality Check and Jira Ticket Creator"
  description: "Runs a data quality SQL check against a Snowflake table. If anomalies are detected, creates a Jira issue for the data engineering team and notifies them in Microsoft Teams."
  tags:
    - data-quality
    - data-engineering
    - snowflake
    - jira
    - microsoft-teams
capability:
  exposes:
    - type: mcp
      namespace: data-quality
      port: 8080
      tools:
        - name: run-quality-check
          description: "Execute a data quality SQL assertion against Snowflake and create a Jira ticket if it fails."
          inputParameters:
            - name: check_query
              in: body
              type: string
              description: "SQL query that returns rows representing data quality violations."
            - name: check_name
              in: body
              type: string
              description: "Human-readable name for this quality check."
            - name: database
              in: body
              type: string
              description: "Snowflake database to run the check against."
            - name: jira_project_key
              in: body
              type: string
              description: "Jira project key for filing data quality issues."
            - name: teams_channel_webhook
              in: body
              type: string
              description: "Microsoft Teams incoming webhook URL for notifications."
          steps:
            - name: run-check
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "{{check_query}}"
                warehouse: "COMPUTE_WH"
                database: "{{database}}"
                schema: "PUBLIC"
            - name: create-jira-issue
              type: call
              call: "jira.create-issue"
              with:
                project_key: "{{jira_project_key}}"
                summary: "Data Quality Failure: {{check_name}}"
                description: "Quality check '{{check_name}}' detected {{run-check.data.length}} violations in database {{database}}. Query: {{check_query}}"
                issue_type: "Bug"
                priority: "High"
            - name: notify-teams
              type: call
              call: "msteams.post-webhook"
              with:
                webhook_url: "{{teams_channel_webhook}}"
                text: "Data Quality Alert: '{{check_name}}' found violations. Jira ticket {{create-jira-issue.key}} created. Review immediately."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://{{jira_domain}}.atlassian.net/rest/api/3"
      authentication:
        type: basic
        username: "$secrets.jira_user"
        password: "$secrets.jira_api_token"
      resources:
        - name: issues
          path: "/issue"
          operations:
            - name: create-issue
              method: POST
    - type: http
      namespace: msteams
      baseUri: "https://outlook.office.com/webhook"
      authentication:
        type: none
      resources:
        - name: webhook
          path: "/{{webhook_url}}"
          inputParameters:
            - name: webhook_url
              in: path
          operations:
            - name: post-webhook
              method: POST
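
The `jira.create-issue` operation posts to the Jira Cloud REST API v3 `/issue` endpoint, which wraps everything in a `fields` envelope and expects rich-text fields such as `description` in Atlassian Document Format (ADF). A minimal sketch of the payload the runtime would need to build from the step inputs (project key, summary, and counts here are placeholder values):

```python
import json

def build_jira_issue(project_key, summary, description, issue_type="Bug"):
    """Build a Jira Cloud v3 create-issue payload; description uses minimal ADF."""
    return json.dumps({
        "fields": {
            "project": {"key": project_key},
            "summary": summary,
            "issuetype": {"name": issue_type},
            # Jira Cloud v3 requires Atlassian Document Format for rich-text fields.
            "description": {
                "type": "doc",
                "version": 1,
                "content": [{"type": "paragraph",
                             "content": [{"type": "text", "text": description}]}],
            },
        }
    })

payload = build_jira_issue(
    "DATA",
    "Data Quality Failure: null_check",
    "Quality check 'null_check' detected 42 violations in database ANALYTICS.",
)
```

Setting `priority` as the step does additionally requires that field to be on the project's create screen, so it is omitted from this sketch.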

Orchestrates data quality remediation pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Quality Remediation Pipeline"
  description: "Orchestrates data quality remediation pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - salesforce
    - slack
    - jira
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-quality-remediation-pipeline
          description: "Orchestrates data quality remediation pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "salesforce.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "slack.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "jira.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST


Orchestrates data quality SLA enforcement pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Quality SLA Enforcement Pipeline"
  description: "Orchestrates data quality SLA enforcement pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - terraform
    - pagerduty
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-quality-sla-enforcement-pipeline
          description: "Orchestrates data quality SLA enforcement pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "terraform.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "pagerduty.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "servicenow.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates data retention policy pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Retention Policy Pipeline"
  description: "Orchestrates data retention policy pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - salesforce
    - slack
    - jira
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-retention-policy-pipeline
          description: "Orchestrates data retention policy pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "salesforce.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "slack.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "jira.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Lists all outbound data shares from the Snowflake account with their consumer details and status. Used by data product owners to monitor sharing health.

naftiko: "0.5"
info:
  label: "Data Share Status Checker"
  description: "Lists all outbound data shares from the Snowflake account with their consumer details and status. Used by data product owners to monitor sharing health."
  tags:
    - data-sharing
    - governance
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-sharing
      port: 8080
      tools:
        - name: list-outbound-shares
          description: "List all outbound data shares from the current Snowflake account."
          inputParameters:
            - name: share_name_filter
              in: body
              type: string
              description: "Optional filter pattern for share names. Use % for all."
          call: "snowflake.submit-statement"
          with:
            statement: "SHOW SHARES LIKE '{{share_name_filter}}'"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: shares
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
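
`SHOW SHARES` returns both inbound and outbound shares, so the rows mapped into the `shares` output still need filtering on the `kind` column before reporting. A minimal sketch, assuming rows have already been keyed by the SHOW SHARES column names (for example via the `resultSetMetaData` of the SQL API response); the exact column names used here (`kind`, `name`, `database_name`, `to`) are assumptions from that output:

```python
def outbound_share_summary(rows):
    """Keep only outbound shares and summarize each share's consumer accounts."""
    summary = []
    for row in rows:
        if row.get("kind") != "OUTBOUND":
            continue
        # The "to" column lists consumer accounts as a comma-separated string.
        consumers = [c.strip() for c in (row.get("to") or "").split(",") if c.strip()]
        summary.append({
            "share": row["name"],
            "database": row.get("database_name"),
            "consumer_count": len(consumers),
        })
    return summary

# Illustrative rows, not real SHOW SHARES output.
shares = [
    {"kind": "OUTBOUND", "name": "SALES_SHARE", "database_name": "SALES",
     "to": "ORG1.ACCT1, ORG2.ACCT2"},
    {"kind": "INBOUND", "name": "PARTNER_SHARE", "database_name": "", "to": ""},
]
result = outbound_share_summary(shares)
```

A share with a zero `consumer_count` is a candidate for cleanup, which is the kind of health signal this capability is meant to surface for data product owners.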

Orchestrates data sharing agreement pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Sharing Agreement Pipeline"
  description: "Orchestrates data sharing agreement pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - servicenow
    - salesforce
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-sharing-agreement-pipeline
          description: "Orchestrates data sharing agreement pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "servicenow.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "snowflake.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "salesforce.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Orchestrates a data team access governance pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Data Team Access Governance Pipeline"
  description: "Orchestrates a data team access governance pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - data
    - snowflake
    - salesforce
    - slack
capability:
  exposes:
    - type: mcp
      namespace: data
      port: 8080
      tools:
        - name: data-team-access-governance-pipeline
          description: "Orchestrates a data team access governance pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "snowflake.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "salesforce.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "slack.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/data"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/data"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/data"
          operations:
            - name: execute-3
              method: POST

Lists all role grants for a given Snowflake user, returning role names and granted privileges. Used by security teams for access auditing.

naftiko: "0.5"
info:
  label: "Database Role Grant Viewer"
  description: "Lists all role grants for a given Snowflake user, returning role names and granted privileges. Used by security teams for access auditing."
  tags:
    - security
    - governance
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-security
      port: 8080
      tools:
        - name: list-user-grants
          description: "Show all roles and privileges granted to a Snowflake user."
          inputParameters:
            - name: username
              in: body
              type: string
              description: "The Snowflake username to audit."
          call: "snowflake.submit-statement"
          with:
            statement: "SHOW GRANTS TO USER {{username}}"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: grants
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
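
`SHOW` commands do not accept bind variables, so the `{{username}}` value is interpolated directly into the SQL text. A conservative guard worth adding in front of such templates is to reject anything that is not a plain unquoted identifier before building the statement; the regex below is a sketch of one such check, not official Snowflake validation:

```python
import re

# Snowflake unquoted identifiers: letter or underscore, then letters,
# digits, underscores, or dollar signs, up to 255 characters total.
# This is a defensive sketch, not Snowflake's authoritative grammar.
_IDENTIFIER = re.compile(r"^[A-Za-z_][A-Za-z0-9_$]{0,254}$")

def build_show_grants(username: str) -> str:
    if not _IDENTIFIER.match(username):
        raise ValueError(f"refusing to interpolate unsafe username: {username!r}")
    return f"SHOW GRANTS TO USER {username}"
```

Anything containing whitespace, quotes, or semicolons fails the check instead of reaching the SQL text.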

Orchestrates a dynamic table optimization pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Dynamic Table Optimization Pipeline"
  description: "Orchestrates a dynamic table optimization pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - dynamic
    - snowflake
    - terraform
    - pagerduty
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: dynamic
      port: 8080
      tools:
        - name: dynamic-table-optimization-pipeline
          description: "Orchestrates a dynamic table optimization pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "terraform.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "pagerduty.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "servicenow.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/dynamic"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/dynamic"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/dynamic"
          operations:
            - name: execute-3
              method: POST

Returns the refresh history for a Snowflake dynamic table, including last refresh time and data freshness. Helps data engineers verify materialized views are current.

naftiko: "0.5"
info:
  label: "Dynamic Table Refresh Status"
  description: "Returns the refresh history for a Snowflake dynamic table, including last refresh time and data freshness. Helps data engineers verify materialized views are current."
  tags:
    - data-engineering
    - dynamic-tables
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-dynamic-tables
      port: 8080
      tools:
        - name: get-dynamic-table-refresh
          description: "Retrieve refresh history for a Snowflake dynamic table."
          inputParameters:
            - name: table_name
              in: body
              type: string
              description: "Fully qualified dynamic table name (database.schema.table)."
          call: "snowflake.submit-statement"
          with:
            statement: "SELECT * FROM TABLE(INFORMATION_SCHEMA.DYNAMIC_TABLE_REFRESH_HISTORY(NAME => '{{table_name}}')) ORDER BY REFRESH_END_TIME DESC LIMIT 10"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "INFORMATION_SCHEMA"
          outputParameters:
            - name: refresh_history
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST

Exports search-optimized data from a Snowflake table and bulk-indexes it into Elasticsearch for full-text search capabilities.

naftiko: "0.5"
info:
  label: "Elasticsearch Index from Snowflake"
  description: "Exports search-optimized data from a Snowflake table and bulk-indexes it into Elasticsearch for full-text search capabilities."
  tags:
    - data-integration
    - search
    - snowflake
    - elasticsearch
capability:
  exposes:
    - type: mcp
      namespace: es-indexing
      port: 8080
      tools:
        - name: index-snowflake-data
          description: "Query Snowflake for search data and bulk-index results into Elasticsearch."
          inputParameters:
            - name: source_query
              in: body
              type: string
              description: "SQL query to extract data for indexing."
            - name: index_name
              in: body
              type: string
              description: "Elasticsearch index name."
            - name: database
              in: body
              type: string
              description: "Snowflake database context."
          steps:
            - name: extract-data
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "{{source_query}}"
                warehouse: "ETL_WH"
                database: "{{database}}"
                schema: "PUBLIC"
            - name: bulk-index
              type: call
              call: "elasticsearch.bulk-index"
              with:
                index: "{{index_name}}"
                body: "{{extract-data.data}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: elasticsearch
      baseUri: "https://{{es_host}}:9200"
      authentication:
        type: basic
        username: "$secrets.es_user"
        password: "$secrets.es_password"
      resources:
        - name: bulk
          path: "/{{index}}/_bulk"
          inputParameters:
            - name: index
              in: path
          operations:
            - name: bulk-index
              method: POST
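
The `bulk-index` step posts to Elasticsearch's `_bulk` endpoint, which expects newline-delimited JSON: an action line followed by a document line per record, with a trailing newline. A sketch of the transformation from Snowflake result rows (assumed here to be dicts) into that body:

```python
import json

# Convert rows into the NDJSON body the Elasticsearch _bulk API expects:
# one {"index": ...} action line, then the document itself, per record.
# The final newline is mandatory -- _bulk rejects bodies without it.
def to_bulk_body(index_name, rows):
    lines = []
    for row in rows:
        lines.append(json.dumps({"index": {"_index": index_name}}))
        lines.append(json.dumps(row))
    return "\n".join(lines) + "\n"
```

The body should be sent with `Content-Type: application/x-ndjson`, not `application/json`.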

Triggers a dbt Cloud job to run transformations against Snowflake, polls for completion, and posts the run status to a Slack channel for the data engineering team.

naftiko: "0.5"
info:
  label: "ELT Pipeline Trigger and Monitor"
  description: "Triggers a dbt Cloud job to run transformations against Snowflake, polls for completion, and posts the run status to a Slack channel for the data engineering team."
  tags:
    - data-engineering
    - elt
    - snowflake
    - dbt
    - slack
capability:
  exposes:
    - type: mcp
      namespace: elt-pipeline
      port: 8080
      tools:
        - name: trigger-dbt-run
          description: "Trigger a dbt Cloud job, wait for completion, and report results to Slack."
          inputParameters:
            - name: dbt_job_id
              in: body
              type: string
              description: "The dbt Cloud job ID to trigger."
            - name: cause
              in: body
              type: string
              description: "Reason for triggering the dbt run."
            - name: slack_channel
              in: body
              type: string
              description: "Slack channel ID to post run results."
          steps:
            - name: trigger-job
              type: call
              call: "dbt-cloud.trigger-run"
              with:
                job_id: "{{dbt_job_id}}"
                cause: "{{cause}}"
            - name: get-run-status
              type: call
              call: "dbt-cloud.get-run"
              with:
                run_id: "{{trigger-job.data.id}}"
            - name: post-result
              type: call
              call: "slack.post-message"
              with:
                channel: "{{slack_channel}}"
                text: "dbt Cloud run {{trigger-job.data.id}} for job {{dbt_job_id}} completed with status: {{get-run-status.data.status_humanized}}. Duration: {{get-run-status.data.duration_humanized}}. Triggered by: {{cause}}"
  consumes:
    - type: http
      namespace: dbt-cloud
      baseUri: "https://cloud.getdbt.com/api/v2/accounts/{{dbt_account_id}}"
      authentication:
        type: bearer
        token: "$secrets.dbt_cloud_token"
      resources:
        - name: runs
          path: "/jobs/{{job_id}}/run"
          inputParameters:
            - name: job_id
              in: path
          operations:
            - name: trigger-run
              method: POST
        - name: run-details
          path: "/runs/{{run_id}}"
          inputParameters:
            - name: run_id
              in: path
          operations:
            - name: get-run
              method: GET
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: messages
          path: "/chat.postMessage"
          operations:
            - name: post-message
              method: POST
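
As written, `get-run-status` checks the run once immediately after triggering; a real monitor polls `GET /runs/{run_id}` until the run reaches a terminal state. A polling sketch, with `fetch` standing in for that dbt Cloud call and returning the numeric run status (dbt Cloud uses 10 = Success, 20 = Error, 30 = Cancelled):

```python
import time

# Terminal run statuses in the dbt Cloud v2 API.
TERMINAL = {10, 20, 30}

def wait_for_run(fetch, run_id, interval=15.0, timeout=1800.0):
    """Poll `fetch(run_id)` until the run finishes or `timeout` elapses."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch(run_id)
        if status in TERMINAL:
            return status
        time.sleep(interval)
    raise TimeoutError(f"dbt run {run_id} did not finish within {timeout}s")
```

The Slack message would then be posted only after `wait_for_run` returns, so `status_humanized` reflects the final state rather than a run still in flight.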

Retrieves external table metadata from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "External Table Metadata"
  description: "Retrieves external table metadata from the Snowflake cloud data platform."
  tags:
    - external
    - snowflake
    - metadata
capability:
  exposes:
    - type: mcp
      namespace: external
      port: 8080
      tools:
        - name: external-table-metadata
          description: "Retrieves external table metadata from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.external-table-metadata"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/external/table/metadata/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: external-table-metadata
              method: GET

Looks up file format definitions in the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "File Format Definition Lookup"
  description: "Looks up file format definitions in the Snowflake cloud data platform."
  tags:
    - file
    - snowflake
    - lookup
capability:
  exposes:
    - type: mcp
      namespace: file
      port: 8080
      tools:
        - name: file-format-definition-lookup
          description: "Looks up file format definitions in the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.file-format-definition-lookup"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/file/format/definition/lookup/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: file-format-definition-lookup
              method: GET

Looks up function definitions in the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Function Definition Lookup"
  description: "Looks up function definitions in the Snowflake cloud data platform."
  tags:
    - function
    - snowflake
    - lookup
capability:
  exposes:
    - type: mcp
      namespace: function
      port: 8080
      tools:
        - name: function-definition-lookup
          description: "Looks up function definitions in the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.function-definition-lookup"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/function/definition/lookup/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: function-definition-lookup
              method: GET

Orchestrates an Iceberg table migration pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Iceberg Table Migration Pipeline"
  description: "Orchestrates an Iceberg table migration pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - iceberg
    - snowflake
    - pagerduty
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: iceberg
      port: 8080
      tools:
        - name: iceberg-table-migration-pipeline
          description: "Orchestrates an Iceberg table migration pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "pagerduty.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "servicenow.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "snowflake.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/iceberg"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/iceberg"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/iceberg"
          operations:
            - name: execute-3
              method: POST

Registers an external Apache Iceberg table in Snowflake from an S3 catalog location, validates the schema, and logs the registration in a Snowflake metadata table.

naftiko: "0.5"
info:
  label: "Iceberg Table Registration and Validation"
  description: "Registers an external Apache Iceberg table in Snowflake from an S3 catalog location, validates the schema, and logs the registration in a Snowflake metadata table."
  tags:
    - data-engineering
    - iceberg
    - snowflake
    - aws-s3
capability:
  exposes:
    - type: mcp
      namespace: iceberg-tables
      port: 8080
      tools:
        - name: register-iceberg-table
          description: "Create an Iceberg table in Snowflake from an S3 catalog and validate the schema."
          inputParameters:
            - name: table_name
              in: body
              type: string
              description: "Name for the Iceberg table in Snowflake."
            - name: catalog_path
              in: body
              type: string
              description: "S3 path to the Iceberg catalog metadata."
            - name: database
              in: body
              type: string
              description: "Target Snowflake database."
            - name: schema
              in: body
              type: string
              description: "Target Snowflake schema."
          steps:
            - name: create-iceberg-table
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CREATE OR REPLACE ICEBERG TABLE {{database}}.{{schema}}.{{table_name}} EXTERNAL_VOLUME = 'S3_VOLUME' CATALOG = 'SNOWFLAKE' BASE_LOCATION = '{{catalog_path}}'"
                warehouse: "ETL_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: validate-schema
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "DESCRIBE TABLE {{database}}.{{schema}}.{{table_name}}"
                warehouse: "ETL_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: log-registration
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "INSERT INTO {{database}}.{{schema}}.TABLE_REGISTRY (TABLE_NAME, TABLE_TYPE, CATALOG_PATH, COLUMN_COUNT, REGISTERED_AT) VALUES ('{{table_name}}', 'ICEBERG', '{{catalog_path}}', {{validate-schema.data.length}}, CURRENT_TIMESTAMP())"
                warehouse: "ETL_WH"
                database: "{{database}}"
                schema: "{{schema}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST

Retrieves available listings from the Snowflake Marketplace for a given search term. Helps data teams discover shared datasets.

naftiko: "0.5"
info:
  label: "Listing Viewer for Data Marketplace"
  description: "Retrieves available listings from the Snowflake Marketplace for a given search term. Helps data teams discover shared datasets."
  tags:
    - data-sharing
    - marketplace
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-marketplace
      port: 8080
      tools:
        - name: search-listings
          description: "Search Snowflake Marketplace listings by keyword."
          inputParameters:
            - name: search_term
              in: body
              type: string
              description: "Keyword to search for in marketplace listings."
          call: "snowflake.submit-statement"
          with:
            statement: "SELECT * FROM SNOWFLAKE.DATA_SHARING_USAGE.LISTING_CONSUMPTION_DAILY WHERE LISTING_NAME ILIKE '%{{search_term}}%' LIMIT 25"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "DATA_SHARING_USAGE"
          outputParameters:
            - name: listings
              type: array
              mapping: "$.data"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST

Queries Snowflake login history for a user or across the account to detect anomalous access patterns. Used by security teams for compliance audits.

naftiko: "0.5"
info:
  label: "Login History Auditor"
  description: "Queries Snowflake login history for a user or across the account to detect anomalous access patterns. Used by security teams for compliance audits."
  tags:
    - security
    - compliance
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-audit
      port: 8080
      tools:
        - name: get-login-history
          description: "Retrieve recent login attempts from Snowflake for a specific user or all users."
          inputParameters:
            - name: username
              in: body
              type: string
              description: "The Snowflake username to audit. Use ALL for all users."
            - name: hours_back
              in: body
              type: integer
              description: "Number of hours back to search."
          call: "snowflake.submit-statement"
          with:
            statement: "SELECT * FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY WHERE EVENT_TIMESTAMP >= DATEADD(hours, -{{hours_back}}, CURRENT_TIMESTAMP()) AND (USER_NAME = '{{username}}' OR '{{username}}' = 'ALL') ORDER BY EVENT_TIMESTAMP DESC LIMIT 100"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: login_records
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
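
Because this is a `SELECT`, the SQL API's `bindings` object can carry the user-supplied `username` and `hours_back` values via `?` placeholders instead of string interpolation, keeping them out of the SQL text entirely. A sketch of the equivalent parameterized payload (field names per the Snowflake SQL API; the function itself is illustrative):

```python
import json

# Parameterized form of the login-history query. Snowflake SQL API
# bindings are keyed "1", "2", ... by placeholder position; values are
# always strings, with "FIXED" for integers and "TEXT" for strings.
def build_login_history_payload(username: str, hours_back: int) -> str:
    statement = (
        "SELECT * FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY "
        "WHERE EVENT_TIMESTAMP >= DATEADD(hours, -?, CURRENT_TIMESTAMP()) "
        "AND (USER_NAME = ? OR ? = 'ALL') "
        "ORDER BY EVENT_TIMESTAMP DESC LIMIT 100"
    )
    return json.dumps({
        "statement": statement,
        "warehouse": "COMPUTE_WH",
        "database": "SNOWFLAKE",
        "schema": "ACCOUNT_USAGE",
        "bindings": {
            "1": {"type": "FIXED", "value": str(hours_back)},
            "2": {"type": "TEXT", "value": username},
            "3": {"type": "TEXT", "value": username},
        },
    })
```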

Retrieves materialized view refresh status from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Materialized View Refresh Status"
  description: "Retrieves materialized view refresh status from the Snowflake cloud data platform."
  tags:
    - materialized
    - snowflake
    - status
capability:
  exposes:
    - type: mcp
      namespace: materialized
      port: 8080
      tools:
        - name: materialized-view-refresh-status
          description: "Retrieves materialized view refresh status from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.materialized-view-refresh-status"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/materialized/view/refresh/status/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: materialized-view-refresh-status
              method: GET

Extracts a training dataset from Snowflake, triggers an ML training job on Kubeflow, monitors completion, and registers the model metadata back in a Snowflake model registry table.

naftiko: "0.5"
info:
  label: "ML Model Training Pipeline"
  description: "Extracts a training dataset from Snowflake, triggers an ML training job on Kubeflow, monitors completion, and registers the model metadata back in a Snowflake model registry table."
  tags:
    - machine-learning
    - data-engineering
    - snowflake
    - kubeflow
capability:
  exposes:
    - type: mcp
      namespace: ml-training
      port: 8080
      tools:
        - name: train-model
          description: "Extract training data from Snowflake, trigger Kubeflow training, and register the model."
          inputParameters:
            - name: training_query
              in: body
              type: string
              description: "SQL query to extract training data from Snowflake."
            - name: experiment_name
              in: body
              type: string
              description: "Kubeflow experiment name."
            - name: pipeline_id
              in: body
              type: string
              description: "Kubeflow pipeline ID to execute."
            - name: model_name
              in: body
              type: string
              description: "Name for the trained model."
          steps:
            - name: extract-data
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "{{training_query}}"
                warehouse: "ML_WH"
                database: "ANALYTICS"
                schema: "ML"
            - name: trigger-training
              type: call
              call: "kubeflow.create-run"
              with:
                experiment_name: "{{experiment_name}}"
                pipeline_id: "{{pipeline_id}}"
                params: "{\"query_id\": \"{{extract-data.statementHandle}}\"}"
            - name: register-model
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "INSERT INTO ANALYTICS.ML.MODEL_REGISTRY (MODEL_NAME, TRAINING_RUN_ID, QUERY_ID, REGISTERED_AT, STATUS) VALUES ('{{model_name}}', '{{trigger-training.run_id}}', '{{extract-data.statementHandle}}', CURRENT_TIMESTAMP(), 'TRAINING')"
                warehouse: "ML_WH"
                database: "ANALYTICS"
                schema: "ML"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: kubeflow
      baseUri: "https://{{kubeflow_endpoint}}/pipeline/apis/v1beta1"
      authentication:
        type: bearer
        token: "$secrets.kubeflow_token"
      resources:
        - name: runs
          path: "/runs"
          operations:
            - name: create-run
              method: POST

Creates or updates a Snowflake network policy with allowed IP ranges, applies it to the account, and logs the change in ServiceNow for audit.

naftiko: "0.5"
info:
  label: "Network Policy Enforcer"
  description: "Creates or updates a Snowflake network policy with allowed IP ranges, applies it to the account, and logs the change in ServiceNow for audit."
  tags:
    - security
    - network
    - snowflake
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: network-security
      port: 8080
      tools:
        - name: enforce-network-policy
          description: "Create a Snowflake network policy with allowed IPs and log to ServiceNow."
          inputParameters:
            - name: policy_name
              in: body
              type: string
              description: "Name for the network policy."
            - name: allowed_ips
              in: body
              type: string
              description: "Comma-separated list of allowed IP addresses or CIDR ranges, each quoted as a SQL string literal (e.g., '10.0.0.0/8','192.168.1.0/24')."
            - name: assignment_group
              in: body
              type: string
              description: "ServiceNow group for change management."
          steps:
            - name: create-policy
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CREATE OR REPLACE NETWORK POLICY {{policy_name}} ALLOWED_IP_LIST = ({{allowed_ips}})"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: apply-policy
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "ALTER ACCOUNT SET NETWORK_POLICY = {{policy_name}}"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: log-change
              type: call
              call: "servicenow.create-change"
              with:
                short_description: "Snowflake network policy '{{policy_name}}' applied to account"
                description: "Network policy {{policy_name}} created with allowed IPs: {{allowed_ips}}. Applied at account level."
                assignment_group: "{{assignment_group}}"
                category: "network_security"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://{{snow_instance}}.service-now.com/api/now"
      authentication:
        type: basic
        username: "$secrets.servicenow_user"
        password: "$secrets.servicenow_password"
      resources:
        - name: changes
          path: "/table/change_request"
          operations:
            - name: create-change
              method: POST
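
Snowflake's `ALLOWED_IP_LIST` takes quoted string literals, so `{{allowed_ips}}` must already contain quotes when it is interpolated into the CREATE statement above. If callers supply bare addresses, they could be quoted before interpolation; a sketch (the helper name is hypothetical):

```python
def build_network_policy_sql(policy_name, allowed_ips):
    """Build the CREATE NETWORK POLICY statement from a comma-separated
    IP/CIDR string, quoting each entry as ALLOWED_IP_LIST requires."""
    quoted = ", ".join(
        f"'{ip.strip()}'" for ip in allowed_ips.split(",") if ip.strip()
    )
    return f"CREATE OR REPLACE NETWORK POLICY {policy_name} ALLOWED_IP_LIST = ({quoted})"

sql = build_network_policy_sql("CORP_POLICY", "192.168.1.0/24, 10.0.0.0/8")
# → CREATE OR REPLACE NETWORK POLICY CORP_POLICY ALLOWED_IP_LIST = ('192.168.1.0/24', '10.0.0.0/8')
```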

Triggers an Apache NiFi processor group to extract data from a source system, monitors the flow status, and verifies rows landed in the target Snowflake table.

naftiko: "0.5"
info:
  label: "NiFi to Snowflake Ingestion Coordinator"
  description: "Triggers an Apache NiFi processor group to extract data from a source system, monitors the flow status, and verifies rows landed in the target Snowflake table."
  tags:
    - data-engineering
    - ingestion
    - snowflake
    - apache-nifi
capability:
  exposes:
    - type: mcp
      namespace: nifi-ingestion
      port: 8080
      tools:
        - name: trigger-nifi-flow
          description: "Start an Apache NiFi processor group and verify data arrived in Snowflake."
          inputParameters:
            - name: processor_group_id
              in: body
              type: string
              description: "NiFi processor group ID to start."
            - name: target_table
              in: body
              type: string
              description: "Fully qualified Snowflake target table."
            - name: expected_min_rows
              in: body
              type: integer
              description: "Minimum expected row count after ingestion."
          steps:
            - name: start-nifi-group
              type: call
              call: "nifi.start-processor-group"
              with:
                id: "{{processor_group_id}}"
                state: "RUNNING"
            - name: check-nifi-status
              type: call
              call: "nifi.get-processor-group"
              with:
                id: "{{processor_group_id}}"
            - name: verify-row-count
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT COUNT(*) AS ROW_COUNT FROM {{target_table}}"
                warehouse: "ETL_WH"
                database: "RAW"
                schema: "PUBLIC"
  consumes:
    - type: http
      namespace: nifi
      baseUri: "https://{{nifi_host}}/nifi-api"
      authentication:
        type: bearer
        token: "$secrets.nifi_token"
      resources:
        - name: processor-groups
          path: "/flow/process-groups/{{id}}"
          inputParameters:
            - name: id
              in: path
          operations:
            - name: start-processor-group
              method: PUT
            - name: get-processor-group
              method: GET
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
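
The tool declares `expected_min_rows`, but the steps stop at running the count query, so the comparison would have to happen in post-processing. A sketch of that check, assuming the SQL API convention of returning result cells as strings (the helper name is ours):

```python
def row_count_ok(statement_response, expected_min_rows):
    """Check the verify-row-count result against the expected minimum.

    The SQL API returns result rows as arrays of strings, so the single
    COUNT(*) cell arrives as e.g. [["12345"]]."""
    count = int(statement_response["data"][0][0])
    return count >= expected_min_rows, count

ok, count = row_count_ok({"data": [["12345"]]}, expected_min_rows=10000)
# ok is True, count is 12345
```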

Retrieves notification integration configuration data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Notification Integration Config"
  description: "Retrieves notification integration configuration data from the Snowflake cloud data platform."
  tags:
    - notification
    - snowflake
    - config
capability:
  exposes:
    - type: mcp
      namespace: notification
      port: 8080
      tools:
        - name: notification-integration-config
          description: "Retrieves notification integration configuration data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.notification-integration-config"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/notification/integration/config/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: notification-integration-config
              method: GET
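
This capability, and the similar lookup capabilities that follow, map the HTTP response into `outputParameters` with JSONPath expressions. Only the top-level `$.field` form appears here; a minimal sketch of how such mappings resolve (the helper is illustrative, not the runtime's implementation):

```python
def map_outputs(response, mappings):
    """Resolve simple `$.field` JSONPath mappings against a response
    dict; nested paths are out of scope for this sketch."""
    out = {}
    for name, path in mappings.items():
        key = path[2:] if path.startswith("$.") else path
        out[name] = response.get(key)
    return out

outputs = map_outputs(
    {"data": {"type": "EMAIL", "enabled": True}, "status": "ok"},
    {"result": "$.data", "status": "$.status"},
)
```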

Retrieves object dependency graph data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Object Dependency Graph"
  description: "Retrieves object dependency graph data from the Snowflake cloud data platform."
  tags:
    - object
    - snowflake
    - graph
capability:
  exposes:
    - type: mcp
      namespace: object
      port: 8080
      tools:
        - name: object-dependency-graph
          description: "Retrieves object dependency graph data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.object-dependency-graph"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/object/dependency/graph/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: object-dependency-graph
              method: GET

Retrieves password policy configuration data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Password Policy Config"
  description: "Retrieves password policy configuration data from the Snowflake cloud data platform."
  tags:
    - password
    - snowflake
    - config
capability:
  exposes:
    - type: mcp
      namespace: password
      port: 8080
      tools:
        - name: password-policy-config
          description: "Retrieves password policy configuration data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.password-policy-config"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/password/policy/config/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: password-policy-config
              method: GET

Retrieves pipe copy history data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Pipe Copy History"
  description: "Retrieves pipe copy history data from the Snowflake cloud data platform."
  tags:
    - pipe
    - snowflake
    - history
capability:
  exposes:
    - type: mcp
      namespace: pipe
      port: 8080
      tools:
        - name: pipe-copy-history
          description: "Retrieves pipe copy history data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.pipe-copy-history"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/pipe/copy/history/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: pipe-copy-history
              method: GET

Retrieves procedure execution history data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Procedure Execution History"
  description: "Retrieves procedure execution history data from the Snowflake cloud data platform."
  tags:
    - procedure
    - snowflake
    - history
capability:
  exposes:
    - type: mcp
      namespace: procedure
      port: 8080
      tools:
        - name: procedure-execution-history
          description: "Retrieves procedure execution history data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.procedure-execution-history"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/procedure/execution/history/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: procedure-execution-history
              method: GET

Collects warehouse utilization and query performance metrics from Snowflake and pushes them to a Prometheus Pushgateway for infrastructure dashboarding.

naftiko: "0.5"
info:
  label: "Prometheus Metrics Exporter for Snowflake"
  description: "Collects warehouse utilization and query performance metrics from Snowflake and pushes them to a Prometheus Pushgateway for infrastructure dashboarding."
  tags:
    - observability
    - platform
    - snowflake
    - prometheus
capability:
  exposes:
    - type: mcp
      namespace: metrics-export
      port: 8080
      tools:
        - name: export-snowflake-metrics
          description: "Collect Snowflake warehouse and query metrics and push them to Prometheus."
          inputParameters:
            - name: warehouse_name
              in: body
              type: string
              description: "Warehouse to collect metrics for."
            - name: prometheus_job
              in: body
              type: string
              description: "Prometheus job name for the pushgateway."
          steps:
            - name: get-warehouse-metrics
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT l.WAREHOUSE_NAME, AVG(l.AVG_RUNNING) AS AVG_RUNNING, AVG(l.AVG_QUEUED_LOAD) AS AVG_QUEUED, (SELECT COALESCE(SUM(m.CREDITS_USED), 0) FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY m WHERE m.WAREHOUSE_NAME = '{{warehouse_name}}' AND m.START_TIME >= DATEADD(hours, -1, CURRENT_TIMESTAMP())) AS TOTAL_CREDITS FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_LOAD_HISTORY l WHERE l.WAREHOUSE_NAME = '{{warehouse_name}}' AND l.START_TIME >= DATEADD(hours, -1, CURRENT_TIMESTAMP()) GROUP BY l.WAREHOUSE_NAME"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: push-metrics
              type: call
              call: "prometheus.push-metrics"
              with:
                job: "{{prometheus_job}}"
                metrics: "snowflake_warehouse_avg_running {{get-warehouse-metrics.data[0][1]}}\nsnowflake_warehouse_avg_queued {{get-warehouse-metrics.data[0][2]}}\nsnowflake_warehouse_credits_used {{get-warehouse-metrics.data[0][3]}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: prometheus
      baseUri: "https://{{pushgateway_host}}"
      authentication:
        type: basic
        username: "$secrets.prometheus_user"
        password: "$secrets.prometheus_password"
      resources:
        - name: metrics
          path: "/metrics/job/{{job}}"
          inputParameters:
            - name: job
              in: path
          operations:
            - name: push-metrics
              method: POST
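
The `push-metrics` step posts plain-text exposition-format samples, one `name value` pair per line; the `{{get-warehouse-metrics.data[0][n]}}` references index into the columns of the single result row. A sketch of assembling that body (the helper name is ours; the Pushgateway expects a trailing newline):

```python
def exposition_lines(metrics):
    """Render metric name/value pairs in the plain-text exposition
    format the Pushgateway accepts (no labels, one sample per line,
    terminated by a newline)."""
    return "\n".join(f"{name} {value}" for name, value in metrics.items()) + "\n"

body = exposition_lines({
    "snowflake_warehouse_avg_running": 2.5,
    "snowflake_warehouse_avg_queued": 0.1,
    "snowflake_warehouse_credits_used": 14.2,
})
```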

Creates and applies a row access policy in Snowflake for multi-tenant data isolation, verifies the policy attachment, and notifies the governance channel in Slack.

naftiko: "0.5"
info:
  label: "Row Access Policy Provisioner"
  description: "Creates and applies a row access policy in Snowflake for multi-tenant data isolation, verifies the policy attachment, and notifies the governance channel in Slack."
  tags:
    - data-governance
    - security
    - snowflake
    - slack
capability:
  exposes:
    - type: mcp
      namespace: row-access
      port: 8080
      tools:
        - name: provision-row-access-policy
          description: "Create a row access policy in Snowflake, apply it to a table, and notify governance team in Slack."
          inputParameters:
            - name: policy_name
              in: body
              type: string
              description: "Name for the new row access policy."
            - name: table_name
              in: body
              type: string
              description: "Table to apply the policy to."
            - name: filter_column
              in: body
              type: string
              description: "Column to use for row filtering."
            - name: allowed_role
              in: body
              type: string
              description: "Role that should have unrestricted access."
            - name: slack_channel
              in: body
              type: string
              description: "Slack channel for governance notifications."
          steps:
            - name: create-policy
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CREATE OR REPLACE ROW ACCESS POLICY {{policy_name}} AS (val VARCHAR) RETURNS BOOLEAN -> IS_ROLE_IN_SESSION('{{allowed_role}}') OR val = CURRENT_ROLE()"
                warehouse: "ADMIN_WH"
                database: "GOVERNANCE"
                schema: "POLICIES"
            - name: apply-policy
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "ALTER TABLE {{table_name}} ADD ROW ACCESS POLICY {{policy_name}} ON ({{filter_column}})"
                warehouse: "ADMIN_WH"
                database: "GOVERNANCE"
                schema: "POLICIES"
            - name: notify-governance
              type: call
              call: "slack.post-message"
              with:
                channel: "{{slack_channel}}"
                text: "Row Access Policy '{{policy_name}}' created and applied to {{table_name}} on column {{filter_column}}. Unrestricted role: {{allowed_role}}."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: messages
          path: "/chat.postMessage"
          operations:
            - name: post-message
              method: POST

Extracts updated records from Salesforce, triggers a Snowpipe refresh to load the externally staged data into a Snowflake staging table, and verifies the pipe status for downstream analytics.

naftiko: "0.5"
info:
  label: "Salesforce Data Sync to Snowflake"
  description: "Extracts updated records from Salesforce, triggers a Snowpipe refresh to load the externally staged data into a Snowflake staging table, and verifies the pipe status for downstream analytics."
  tags:
    - data-integration
    - data-engineering
    - snowflake
    - salesforce
capability:
  exposes:
    - type: mcp
      namespace: sf-sync
      port: 8080
      tools:
        - name: sync-salesforce-to-snowflake
          description: "Pull recent Salesforce records and load them into Snowflake via Snowpipe."
          inputParameters:
            - name: sobject
              in: body
              type: string
              description: "Salesforce object API name (e.g., Account, Opportunity)."
            - name: since_date
              in: body
              type: string
              description: "ISO 8601 date to filter records modified since."
            - name: pipe_name
              in: body
              type: string
              description: "Fully qualified Snowpipe name to trigger."
          steps:
            - name: query-salesforce
              type: call
              call: "salesforce.query-records"
              with:
                q: "SELECT Id, Name, LastModifiedDate FROM {{sobject}} WHERE LastModifiedDate >= {{since_date}}"
            - name: trigger-pipe
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "ALTER PIPE {{pipe_name}} REFRESH"
                warehouse: "ETL_WH"
                database: "RAW"
                schema: "SALESFORCE"
            - name: verify-load
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT SYSTEM$PIPE_STATUS('{{pipe_name}}')"
                warehouse: "ETL_WH"
                database: "RAW"
                schema: "SALESFORCE"
  consumes:
    - type: http
      namespace: salesforce
      baseUri: "https://{{sf_instance}}.salesforce.com/services/data/v59.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_token"
      resources:
        - name: query
          path: "/query"
          inputParameters:
            - name: q
              in: query
          operations:
            - name: query-records
              method: GET
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
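
The `query-salesforce` step sends its SOQL in the `q` query parameter; SOQL datetime literals are unquoted, and the whole statement must be URL-encoded. A sketch of the resulting request URL (the helper and the `example.my.salesforce.com` host are illustrative):

```python
from urllib.parse import urlencode

def soql_query_url(base_uri, sobject, since_date):
    """Build the Salesforce query URL the query-salesforce step issues,
    URL-encoding the SOQL into the `q` parameter."""
    soql = (
        f"SELECT Id, Name, LastModifiedDate FROM {sobject} "
        f"WHERE LastModifiedDate >= {since_date}"
    )
    return f"{base_uri}/query?{urlencode({'q': soql})}"

url = soql_query_url(
    "https://example.my.salesforce.com/services/data/v59.0",
    "Account",
    "2024-01-01T00:00:00Z",
)
```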

Retrieves sequence value lookup data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Sequence Value Lookup"
  description: "Retrieves sequence value lookup data from the Snowflake cloud data platform."
  tags:
    - sequence
    - snowflake
    - lookup
capability:
  exposes:
    - type: mcp
      namespace: sequence
      port: 8080
      tools:
        - name: sequence-value-lookup
          description: "Retrieves sequence value lookup data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.sequence-value-lookup"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/sequence/value/lookup/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: sequence-value-lookup
              method: GET

Retrieves session policy lookup data from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Session Policy Lookup"
  description: "Retrieves session policy lookup data from the Snowflake cloud data platform."
  tags:
    - session
    - snowflake
    - lookup
capability:
  exposes:
    - type: mcp
      namespace: session
      port: 8080
      tools:
        - name: session-policy-lookup
          description: "Retrieves session policy lookup data from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The input id."
          call: "snowflake.session-policy-lookup"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/session/policy/lookup/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: session-policy-lookup
              method: GET

Identifies the top slow-running queries in Snowflake over a given period, fetches their execution plans, and posts a summary to a Datadog dashboard event stream for performance monitoring.

naftiko: "0.5"
info:
  label: "Slow Query Investigator"
  description: "Identifies the top slow-running queries in Snowflake over a given period, fetches their execution plans, and posts a summary to a Datadog dashboard event stream for performance monitoring."
  tags:
    - performance
    - data-warehousing
    - snowflake
    - datadog
capability:
  exposes:
    - type: mcp
      namespace: query-performance
      port: 8080
      tools:
        - name: investigate-slow-queries
          description: "Find top slow queries in Snowflake and push a summary event to Datadog."
          inputParameters:
            - name: hours_back
              in: body
              type: integer
              description: "Number of hours to look back for slow queries."
            - name: min_duration_seconds
              in: body
              type: integer
              description: "Minimum execution time in seconds to flag as slow."
            - name: dd_tags
              in: body
              type: string
              description: "Comma-separated Datadog tags for the event."
          steps:
            - name: find-slow-queries
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT QUERY_ID, QUERY_TEXT, TOTAL_ELAPSED_TIME/1000 AS DURATION_SEC, WAREHOUSE_NAME, USER_NAME, START_TIME FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY WHERE START_TIME >= DATEADD(hours, -{{hours_back}}, CURRENT_TIMESTAMP()) AND TOTAL_ELAPSED_TIME > {{min_duration_seconds}} * 1000 ORDER BY TOTAL_ELAPSED_TIME DESC LIMIT 20"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: post-to-datadog
              type: call
              call: "datadog.create-event"
              with:
                title: "Snowflake Slow Query Report (last {{hours_back}}h)"
                text: "Found {{find-slow-queries.data.length}} queries exceeding {{min_duration_seconds}}s threshold. Top query: {{find-slow-queries.data[0][1]}} ({{find-slow-queries.data[0][2]}}s)"
                tags: "{{dd_tags}}"
                alert_type: "warning"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v1"
      authentication:
        type: apiKey
        key: "$secrets.datadog_api_key"
      inputParameters:
        - name: DD-APPLICATION-KEY
          in: header
          value: "$secrets.datadog_app_key"
      resources:
        - name: events
          path: "/events"
          operations:
            - name: create-event
              method: POST
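
`TOTAL_ELAPSED_TIME` is reported in milliseconds, hence the `* 1000` in the filter, and `data[0][1]`/`data[0][2]` pick the `QUERY_TEXT` and `DURATION_SEC` columns of the top row. A sketch of formatting the Datadog event text from those rows (the helper name is ours; SQL API cells arrive as strings):

```python
def datadog_event_text(rows, min_duration_seconds):
    """Format the Datadog event body from the query-history result rows.

    Each row follows the SELECT column order: QUERY_ID, QUERY_TEXT,
    DURATION_SEC, WAREHOUSE_NAME, USER_NAME, START_TIME."""
    if not rows:
        return f"No queries exceeded the {min_duration_seconds}s threshold."
    top = rows[0]
    return (
        f"Found {len(rows)} queries exceeding {min_duration_seconds}s threshold. "
        f"Top query: {top[1]} ({top[2]}s)"
    )

text = datadog_event_text(
    [["01ab", "SELECT * FROM big_table", "412.7", "COMPUTE_WH", "ETL_USER", "2024-05-01"]],
    min_duration_seconds=300,
)
# → "Found 1 queries exceeding 300s threshold. Top query: SELECT * FROM big_table (412.7s)"
```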

Orchestrates a Snowflake account migration pipeline, coordinating GitHub, Confluence, and Terraform steps and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake Account Migration Pipeline"
  description: "Orchestrates a Snowflake account migration pipeline, coordinating GitHub, Confluence, and Terraform steps and notifying stakeholders."
  tags:
    - snowflake
    - github
    - confluence
    - terraform
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-account-migration-pipeline
          description: "Orchestrates a Snowflake account migration pipeline, coordinating GitHub, Confluence, and Terraform steps and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "github.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "confluence.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "terraform.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST

Generates a comprehensive account usage report from Snowflake covering storage, compute, and user activity, then emails the report to stakeholders via SendGrid.

naftiko: "0.5"
info:
  label: "Snowflake Account Usage Report Generator"
  description: "Generates a comprehensive account usage report from Snowflake covering storage, compute, and user activity, then emails the report to stakeholders via SendGrid."
  tags:
    - platform
    - reporting
    - snowflake
    - sendgrid
capability:
  exposes:
    - type: mcp
      namespace: usage-reporting
      port: 8080
      tools:
        - name: generate-usage-report
          description: "Compile Snowflake account usage metrics and email the report."
          inputParameters:
            - name: report_month
              in: body
              type: string
              description: "Month for the report in YYYY-MM format."
            - name: recipient_emails
              in: body
              type: string
              description: "Comma-separated email addresses for report delivery."
          steps:
            - name: get-storage
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT DATABASE_NAME, AVERAGE_DATABASE_BYTES / POWER(1024,3) AS AVG_GB FROM SNOWFLAKE.ACCOUNT_USAGE.DATABASE_STORAGE_USAGE_HISTORY WHERE USAGE_DATE >= '{{report_month}}-01' AND USAGE_DATE < DATEADD(month, 1, '{{report_month}}-01') GROUP BY DATABASE_NAME ORDER BY AVG_GB DESC"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: get-compute
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT WAREHOUSE_NAME, SUM(CREDITS_USED) AS TOTAL_CREDITS FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY WHERE START_TIME >= '{{report_month}}-01' AND START_TIME < DATEADD(month, 1, '{{report_month}}-01') GROUP BY WAREHOUSE_NAME ORDER BY TOTAL_CREDITS DESC"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: get-active-users
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT COUNT(DISTINCT USER_NAME) AS ACTIVE_USERS FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY WHERE EVENT_TIMESTAMP >= '{{report_month}}-01' AND EVENT_TIMESTAMP < DATEADD(month, 1, '{{report_month}}-01') AND IS_SUCCESS = 'YES'"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: send-report
              type: call
              call: "sendgrid.send-email"
              with:
                to: "{{recipient_emails}}"
                subject: "Snowflake Account Usage Report - {{report_month}}"
                body: "Monthly Snowflake Report for {{report_month}}. Active Users: {{get-active-users.data[0][0]}}. Top warehouse credits and storage details attached."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: sendgrid
      baseUri: "https://api.sendgrid.com/v3"
      authentication:
        type: bearer
        token: "$secrets.sendgrid_api_key"
      resources:
        - name: mail
          path: "/mail/send"
          operations:
            - name: send-email
              method: POST
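
The `snowflake.submit-statement` operation above maps onto Snowflake's SQL API (`POST /api/v2/statements`). A minimal Python sketch of the request this capability would issue — the account identifier, JWT, and statement text are placeholder values, not part of the capability definition:

```python
import json

SQL_API_PATH = "/api/v2/statements"

def build_statement_request(account_identifier, jwt, statement,
                            warehouse, database, schema):
    """Build the (url, headers, body) triple for one submit-statement call."""
    url = f"https://{account_identifier}.snowflakecomputing.com{SQL_API_PATH}"
    headers = {
        "Authorization": f"Bearer {jwt}",
        # Required when authenticating with a key-pair JWT rather than OAuth.
        "X-Snowflake-Authorization-Token-Type": "KEYPAIR_JWT",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "statement": statement,
        "warehouse": warehouse,
        "database": database,
        "schema": schema,
    })
    return url, headers, body

# Example: a simplified form of the get-active-users step (values are placeholders).
url, headers, body = build_statement_request(
    "myorg-myaccount", "dummy-jwt",
    "SELECT COUNT(DISTINCT USER_NAME) AS ACTIVE_USERS "
    "FROM SNOWFLAKE.ACCOUNT_USAGE.LOGIN_HISTORY WHERE IS_SUCCESS = 'YES'",
    warehouse="COMPUTE_WH", database="SNOWFLAKE", schema="ACCOUNT_USAGE")
```

The same request shape serves every `submit-statement` step in this catalog; only the statement and context fields change.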

Creates a zero-copy clone of a production Snowflake database for development, applies masking policies, and notifies the requesting developer in Microsoft Teams.

naftiko: "0.5"
info:
  label: "Snowflake Clone for Dev Environment"
  description: "Creates a zero-copy clone of a production Snowflake database for development, applies masking policies, and notifies the requesting developer in Microsoft Teams."
  tags:
    - platform
    - data-engineering
    - snowflake
    - microsoft-teams
capability:
  exposes:
    - type: mcp
      namespace: env-cloning
      port: 8080
      tools:
        - name: clone-database
          description: "Create a zero-copy clone of a Snowflake database and notify the requestor."
          inputParameters:
            - name: source_database
              in: body
              type: string
              description: "Production database to clone."
            - name: clone_name
              in: body
              type: string
              description: "Name for the cloned database."
            - name: requestor_email
              in: body
              type: string
              description: "Email of the developer requesting the clone."
          steps:
            - name: create-clone
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CREATE DATABASE {{clone_name}} CLONE {{source_database}}"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: grant-access
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "GRANT USAGE ON DATABASE {{clone_name}} TO ROLE DEV_ROLE; GRANT USAGE ON ALL SCHEMAS IN DATABASE {{clone_name}} TO ROLE DEV_ROLE; GRANT SELECT ON ALL TABLES IN DATABASE {{clone_name}} TO ROLE DEV_ROLE"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: notify-developer
              type: call
              call: "msteams.send-message"
              with:
                recipient_upn: "{{requestor_email}}"
                text: "Your Snowflake dev clone '{{clone_name}}' from '{{source_database}}' is ready. Access granted via DEV_ROLE. Clone will be auto-dropped in 7 days."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: msteams
      baseUri: "https://graph.microsoft.com/v1.0"
      authentication:
        type: bearer
        token: "$secrets.msgraph_token"
      resources:
        - name: messages
          path: "/users/{{recipient_upn}}/sendMail"
          inputParameters:
            - name: recipient_upn
              in: path
          operations:
            - name: send-message
              method: POST
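
Because `{{clone_name}}` and `{{source_database}}` are interpolated directly into DDL, a conservative runtime would validate them as plain unquoted identifiers before substitution. A hedged sketch of that substitution — the `render` helper and identifier rule are illustrative, not part of Naftiko:

```python
import re

# Unquoted Snowflake identifier rule (assumption: this capability does not
# accept quoted identifiers, which keeps the interpolated DDL injection-safe).
IDENT = re.compile(r"^[A-Za-z_][A-Za-z0-9_$]*$")

def safe_ident(name):
    """Reject anything that is not a plain unquoted identifier."""
    if not IDENT.match(name):
        raise ValueError(f"unsafe identifier: {name!r}")
    return name

def render(template, params):
    """Substitute {{name}} placeholders the way the capability steps do."""
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: safe_ident(params[m.group(1)]), template)

CLONE_STMT = "CREATE DATABASE {{clone_name}} CLONE {{source_database}}"
stmt = render(CLONE_STMT, {"clone_name": "SALES_DEV",
                           "source_database": "SALES_PROD"})
```

A value such as `"X; DROP DATABASE PROD"` fails validation instead of reaching the clone statement.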

Compares current warehouse credit usage against historical averages. When spend exceeds a threshold, sends a Slack alert and logs the anomaly in a Snowflake audit table.

naftiko: "0.5"
info:
  label: "Snowflake Cost Anomaly Detector and Notifier"
  description: "Compares current warehouse credit usage against historical averages. When spend exceeds a threshold, sends a Slack alert and logs the anomaly in a Snowflake audit table."
  tags:
    - cost-management
    - platform
    - snowflake
    - slack
capability:
  exposes:
    - type: mcp
      namespace: cost-anomaly
      port: 8080
      tools:
        - name: detect-cost-anomaly
          description: "Compare recent Snowflake credit usage against the historical baseline and alert on anomalies via Slack."
          inputParameters:
            - name: warehouse_name
              in: body
              type: string
              description: "Warehouse to monitor for cost anomalies."
            - name: threshold_multiplier
              in: body
              type: number
              description: "Multiplier above average that triggers an alert (e.g., 2.0 for 2x average)."
            - name: slack_channel
              in: body
              type: string
              description: "Slack channel for cost anomaly alerts."
          steps:
            - name: get-recent-usage
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT SUM(CREDITS_USED) AS RECENT_CREDITS FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY WHERE WAREHOUSE_NAME = '{{warehouse_name}}' AND START_TIME >= DATEADD(hours, -24, CURRENT_TIMESTAMP())"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: get-baseline
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT AVG(DAILY_CREDITS) AS AVG_CREDITS FROM (SELECT DATE_TRUNC('day', START_TIME) AS DAY, SUM(CREDITS_USED) AS DAILY_CREDITS FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY WHERE WAREHOUSE_NAME = '{{warehouse_name}}' AND START_TIME >= DATEADD(days, -30, CURRENT_TIMESTAMP()) GROUP BY DAY)"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: alert-slack
              type: call
              call: "slack.post-message"
              with:
                channel: "{{slack_channel}}"
                text: "Cost Anomaly: Warehouse '{{warehouse_name}}' used {{get-recent-usage.data[0][0]}} credits in the last 24h vs {{get-baseline.data[0][0]}} daily average. Threshold multiplier: {{threshold_multiplier}}x."
            - name: log-anomaly
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "INSERT INTO OBSERVABILITY.COST.ANOMALY_LOG (DETECTED_AT, WAREHOUSE_NAME, RECENT_CREDITS, BASELINE_CREDITS, THRESHOLD) VALUES (CURRENT_TIMESTAMP(), '{{warehouse_name}}', {{get-recent-usage.data[0][0]}}, {{get-baseline.data[0][0]}}, {{threshold_multiplier}})"
                warehouse: "COMPUTE_WH"
                database: "OBSERVABILITY"
                schema: "COST"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: messages
          path: "/chat.postMessage"
          operations:
            - name: post-message
              method: POST
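
As defined, the tool posts to Slack on every run; an implementation would normally gate the alert and the audit insert on the threshold comparison. The decision reduces to a one-line check (a sketch — the zero-baseline handling is an assumption, not specified by the capability):

```python
def is_anomalous(recent_credits, avg_daily_credits, threshold_multiplier):
    """True when the last 24h of credits exceeds the 30-day daily average
    by more than the configured multiplier."""
    if avg_daily_credits <= 0:
        # Assumption: any spend on a previously idle warehouse is anomalous.
        return recent_credits > 0
    return recent_credits > threshold_multiplier * avg_daily_credits
```

With `threshold_multiplier` of 2.0, 10 recent credits against a 4-credit daily average triggers the alert; 7 does not.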

Queries Snowflake Access History to trace data lineage for a table, then publishes the lineage graph to a Confluence documentation page for governance teams.

naftiko: "0.5"
info:
  label: "Snowflake Data Lineage Reporter"
  description: "Queries Snowflake Access History to trace data lineage for a table, then publishes the lineage graph to a Confluence documentation page for governance teams."
  tags:
    - data-governance
    - lineage
    - snowflake
    - confluence
capability:
  exposes:
    - type: mcp
      namespace: data-lineage
      port: 8080
      tools:
        - name: trace-lineage
          description: "Trace data lineage for a Snowflake table and publish to Confluence."
          inputParameters:
            - name: table_name
              in: body
              type: string
              description: "Fully qualified table name to trace lineage for."
            - name: days_back
              in: body
              type: integer
              description: "Number of days of access history to analyze."
            - name: confluence_page_id
              in: body
              type: string
              description: "Confluence page ID for lineage documentation."
          steps:
            - name: get-lineage
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT DISTINCT ah.DIRECT_OBJECTS_ACCESSED, ah.BASE_OBJECTS_ACCESSED, ah.OBJECTS_MODIFIED, ah.USER_NAME, ah.QUERY_START_TIME FROM SNOWFLAKE.ACCOUNT_USAGE.ACCESS_HISTORY ah, LATERAL FLATTEN(input => ah.OBJECTS_MODIFIED) obj WHERE obj.value:objectName::STRING = '{{table_name}}' AND ah.QUERY_START_TIME >= DATEADD(days, -{{days_back}}, CURRENT_TIMESTAMP()) ORDER BY ah.QUERY_START_TIME DESC LIMIT 50"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: publish-to-confluence
              type: call
              call: "confluence.update-page"
              with:
                page_id: "{{confluence_page_id}}"
                title: "Data Lineage: {{table_name}}"
                body: "Lineage analysis for {{table_name}} over the last {{days_back}} days. {{get-lineage.data.length}} transformation operations detected."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://{{confluence_domain}}.atlassian.net/wiki/api/v2"
      authentication:
        type: basic
        username: "$secrets.confluence_user"
        password: "$secrets.confluence_api_token"
      resources:
        - name: pages
          path: "/pages/{{page_id}}"
          inputParameters:
            - name: page_id
              in: path
          operations:
            - name: update-page
              method: PUT
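
The `OBJECTS_MODIFIED` and `BASE_OBJECTS_ACCESSED` columns are VARIANT arrays of objects, so building the lineage graph means pairing each base object with each modified object per query. A sketch of that edge extraction over decoded rows — the sample data is hypothetical, though the `objectName` field follows the ACCESS_HISTORY documentation:

```python
def lineage_edges(rows):
    """Extract (source, target) table pairs from ACCESS_HISTORY-style rows.

    Each row mimics the BASE_OBJECTS_ACCESSED / OBJECTS_MODIFIED VARIANT
    columns after JSON decoding."""
    edges = set()
    for row in rows:
        sources = {o["objectName"] for o in row.get("baseObjectsAccessed", [])}
        targets = {o["objectName"] for o in row.get("objectsModified", [])}
        edges |= {(s, t) for s in sources for t in targets}
    return edges

# Hypothetical sample resembling two decoded ACCESS_HISTORY rows.
sample = [
    {"baseObjectsAccessed": [{"objectName": "RAW.SALES.ORDERS"}],
     "objectsModified": [{"objectName": "ANALYTICS.MART.DAILY_SALES"}]},
    {"baseObjectsAccessed": [{"objectName": "RAW.SALES.CUSTOMERS"}],
     "objectsModified": [{"objectName": "ANALYTICS.MART.DAILY_SALES"}]},
]
edges = lineage_edges(sample)
```

The resulting edge set is what the Confluence page would render as the lineage graph.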

Orchestrates a Snowflake disaster recovery pipeline, coordinating ServiceNow, Snowflake, and Salesforce services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake Disaster Recovery Pipeline"
  description: "Orchestrates a Snowflake disaster recovery pipeline, coordinating ServiceNow, Snowflake, and Salesforce services and notifying stakeholders."
  tags:
    - snowflake
    - servicenow
    - salesforce
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-disaster-recovery-pipeline
          description: "Orchestrates a Snowflake disaster recovery pipeline, coordinating ServiceNow, Snowflake, and Salesforce services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "servicenow.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "snowflake.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "salesforce.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST

Creates an external function in Snowflake that proxies to an AWS Lambda via API Gateway, verifies the integration, and documents the function in Confluence.

naftiko: "0.5"
info:
  label: "Snowflake External Function Deployer"
  description: "Creates an external function in Snowflake that proxies to an AWS Lambda via API Gateway, verifies the integration, and documents the function in Confluence."
  tags:
    - platform
    - serverless
    - snowflake
    - aws-lambda
    - confluence
capability:
  exposes:
    - type: mcp
      namespace: external-functions
      port: 8080
      tools:
        - name: deploy-external-function
          description: "Create a Snowflake external function backed by AWS Lambda and document in Confluence."
          inputParameters:
            - name: function_name
              in: body
              type: string
              description: "Name for the Snowflake external function."
            - name: api_integration
              in: body
              type: string
              description: "Snowflake API integration name for the Lambda proxy."
            - name: lambda_url
              in: body
              type: string
              description: "API Gateway endpoint URL for the Lambda function."
            - name: confluence_page_id
              in: body
              type: string
              description: "Confluence page for external function documentation."
          steps:
            - name: create-function
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CREATE OR REPLACE EXTERNAL FUNCTION {{function_name}}(input VARIANT) RETURNS VARIANT API_INTEGRATION = {{api_integration}} AS '{{lambda_url}}'"
                warehouse: "ADMIN_WH"
                database: "INTEGRATIONS"
                schema: "EXTERNAL_FUNCTIONS"
            - name: test-function
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT {{function_name}}(PARSE_JSON('{\"test\": true}')) AS TEST_RESULT"
                warehouse: "ADMIN_WH"
                database: "INTEGRATIONS"
                schema: "EXTERNAL_FUNCTIONS"
            - name: document-function
              type: call
              call: "confluence.update-page"
              with:
                page_id: "{{confluence_page_id}}"
                title: "External Function: {{function_name}}"
                body: "External function {{function_name}} deployed. API Integration: {{api_integration}}. Lambda URL: {{lambda_url}}. Test result: {{test-function.data[0][0]}}."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://{{confluence_domain}}.atlassian.net/wiki/api/v2"
      authentication:
        type: basic
        username: "$secrets.confluence_user"
        password: "$secrets.confluence_api_token"
      resources:
        - name: pages
          path: "/pages/{{page_id}}"
          inputParameters:
            - name: page_id
              in: path
          operations:
            - name: update-page
              method: PUT
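
Behind the API integration, Snowflake sends the external function's arguments to the proxied endpoint as batched rows (`{"data": [[row_number, arg1, ...], ...]}`) and expects a response of the same shape with one result per row number. A minimal echo-style Lambda handler satisfying that contract — illustrative only, not the function this capability deploys:

```python
import json

def lambda_handler(event, context):
    """Echo handler matching the Snowflake external function batch format.

    The request body carries {"data": [[row_number, arg1, ...], ...]};
    the response must return one result row per input row_number."""
    rows = json.loads(event["body"])["data"]
    out = [[row[0], {"echo": row[1]}] for row in rows]
    return {"statusCode": 200, "body": json.dumps({"data": out})}

# Simulate the test-function step's PARSE_JSON('{"test": true}') argument.
resp = lambda_handler({"body": json.dumps({"data": [[0, {"test": True}]]})}, None)
```

A non-200 response, or a response missing a row number, surfaces as an error in the `test-function` step's `SELECT`.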

Orchestrates a Snowflake feature store pipeline, coordinating Snowflake, Salesforce, and Slack services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake Feature Store Pipeline"
  description: "Orchestrates a Snowflake feature store pipeline, coordinating Snowflake, Salesforce, and Slack services and notifying stakeholders."
  tags:
    - snowflake
    - salesforce
    - slack
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-feature-store-pipeline
          description: "Orchestrates a Snowflake feature store pipeline, coordinating Snowflake, Salesforce, and Slack services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "snowflake.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "salesforce.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "slack.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST

Orchestrates a Snowflake hybrid table pipeline, coordinating Confluence, Terraform, and PagerDuty services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake Hybrid Table Pipeline"
  description: "Orchestrates a Snowflake hybrid table pipeline, coordinating Confluence, Terraform, and PagerDuty services and notifying stakeholders."
  tags:
    - snowflake
    - confluence
    - terraform
    - pagerduty
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-hybrid-table-pipeline
          description: "Orchestrates a Snowflake hybrid table pipeline, coordinating Confluence, Terraform, and PagerDuty services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "confluence.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "terraform.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "pagerduty.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST

Orchestrates a Snowflake Native App deployment pipeline, coordinating Confluence, Terraform, and PagerDuty services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake Native App Deployment Pipeline"
  description: "Orchestrates a Snowflake Native App deployment pipeline, coordinating Confluence, Terraform, and PagerDuty services and notifying stakeholders."
  tags:
    - snowflake
    - confluence
    - terraform
    - pagerduty
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-native-app-deployment-pipeline
          description: "Orchestrates a Snowflake Native App deployment pipeline, coordinating Confluence, Terraform, and PagerDuty services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "confluence.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "terraform.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "pagerduty.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: terraform
      baseUri: "https://app.terraform.io/api/v2"
      authentication:
        type: bearer
        token: "$secrets.terraform_token"
      resources:
        - name: terraform-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST

Executes a Snowflake Notebook stored procedure, captures the output, and archives the results in an S3 bucket for compliance retention.

naftiko: "0.5"
info:
  label: "Snowflake Notebook Execution and Results Archiver"
  description: "Executes a Snowflake Notebook stored procedure, captures the output, and archives the results in an S3 bucket for compliance retention."
  tags:
    - data-engineering
    - notebooks
    - snowflake
    - aws-s3
capability:
  exposes:
    - type: mcp
      namespace: notebook-execution
      port: 8080
      tools:
        - name: run-notebook
          description: "Execute a Snowflake notebook stored procedure and archive output to S3."
          inputParameters:
            - name: notebook_proc
              in: body
              type: string
              description: "Fully qualified stored procedure name for the notebook."
            - name: s3_archive_path
              in: body
              type: string
              description: "S3 path for archiving results."
            - name: database
              in: body
              type: string
              description: "Database containing the notebook procedure."
            - name: schema
              in: body
              type: string
              description: "Schema containing the notebook procedure."
          steps:
            - name: execute-notebook
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CALL {{notebook_proc}}()"
                warehouse: "ML_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: archive-results
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "COPY INTO '{{s3_archive_path}}/{{execute-notebook.statementHandle}}/' FROM (SELECT * FROM TABLE(RESULT_SCAN('{{execute-notebook.statementHandle}}'))) FILE_FORMAT = (TYPE = JSON) STORAGE_INTEGRATION = S3_INTEGRATION OVERWRITE = TRUE"
                warehouse: "ML_WH"
                database: "{{database}}"
                schema: "{{schema}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
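
The archive step reuses `{{execute-notebook.statementHandle}}` in `RESULT_SCAN`, which assumes the notebook call has finished. For long-running procedures the SQL API keeps returning HTTP 202 from `GET /api/v2/statements/<handle>` until execution completes, so a runtime would typically poll before archiving. A sketch with an injectable status fetcher so the loop can run without a live account:

```python
import time

def wait_for_statement(fetch_status, handle, poll_seconds=2.0, max_polls=30):
    """Poll until GET /api/v2/statements/<handle> stops returning 202.

    fetch_status(handle) is injected and should return the HTTP status
    code for the handle; 200 means complete, 422 means the statement failed."""
    for _ in range(max_polls):
        status = fetch_status(handle)
        if status != 202:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"statement {handle} still running after {max_polls} polls")
```

Only after this returns 200 would the `COPY INTO` against `RESULT_SCAN` see the notebook's full result set.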

Orchestrates a Snowflake Polaris catalog pipeline, coordinating Slack, Jira, and Datadog services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake Polaris Catalog Pipeline"
  description: "Orchestrates a Snowflake Polaris catalog pipeline, coordinating Slack, Jira, and Datadog services and notifying stakeholders."
  tags:
    - snowflake
    - slack
    - jira
    - datadog
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-polaris-catalog-pipeline
          description: "Orchestrates a Snowflake Polaris catalog pipeline, coordinating Slack, Jira, and Datadog services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "slack.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "jira.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "datadog.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST
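
The generated pipelines in this catalog all share one shape: resolve the `{{...}}` placeholders in each step's `with` block against the tool's input parameters, then dispatch the calls in order. A minimal sketch of that substitute-and-dispatch loop (the `render` and `dispatch` helpers are illustrative, not Naftiko's actual runtime):

```python
import re

def render(template, params):
    """Resolve {{name}} placeholders against the tool's input parameters."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(params[m.group(1)]), template)

def run_steps(steps, params, dispatch):
    """Execute steps in order, passing rendered inputs to each service call."""
    results = {}
    for step in steps:
        payload = {k: render(v, params) for k, v in step["with"].items()}
        results[step["name"]] = dispatch(step["call"], payload)
    return results
```

For example, `render("{{input_id}}", {"input_id": "abc"})` resolves to `"abc"` before the step's payload is posted to the consuming namespace.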

Checks the replication status of a Snowflake failover group across regions and alerts the platform team in Slack if replication lag exceeds thresholds.

naftiko: "0.5"
info:
  label: "Snowflake Replication Group Monitor"
  description: "Checks the replication status of a Snowflake failover group across regions and alerts the platform team in Slack if replication lag exceeds thresholds."
  tags:
    - platform
    - disaster-recovery
    - snowflake
    - slack
capability:
  exposes:
    - type: mcp
      namespace: replication-monitor
      port: 8080
      tools:
        - name: check-replication-lag
          description: "Monitor Snowflake replication group lag and alert on threshold breach."
          inputParameters:
            - name: replication_group
              in: body
              type: string
              description: "Name of the Snowflake replication or failover group."
            - name: max_lag_minutes
              in: body
              type: integer
              description: "Maximum acceptable replication lag in minutes."
            - name: slack_channel
              in: body
              type: string
              description: "Slack channel for replication alerts."
          steps:
            - name: check-lag
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT REPLICATION_GROUP_NAME, PHASE, PRIMARY_SNAPSHOT_TIMESTAMP, SECONDARY_SNAPSHOT_TIMESTAMP, DATEDIFF(minute, SECONDARY_SNAPSHOT_TIMESTAMP, PRIMARY_SNAPSHOT_TIMESTAMP) AS LAG_MINUTES FROM TABLE(INFORMATION_SCHEMA.REPLICATION_GROUP_REFRESH_PROGRESS('{{replication_group}}'))"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "INFORMATION_SCHEMA"
            - name: alert-if-lagging
              type: call
              call: "slack.post-message"
              with:
                channel: "{{slack_channel}}"
                text: "Replication Alert: Group '{{replication_group}}' lag is {{check-lag.data[0][4]}} minutes (threshold: {{max_lag_minutes}}m). Phase: {{check-lag.data[0][1]}}. Investigate immediately."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: messages
          path: "/chat.postMessage"
          operations:
            - name: post-message
              method: POST
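
The alert should only fire when the computed `LAG_MINUTES` exceeds `max_lag_minutes`. That threshold check can be sketched as follows, assuming the SQL API's row-array result shape with columns in the same order as the SELECT above (an assumption about the result layout, not a guarantee):

```python
def breaches_threshold(row, max_lag_minutes):
    """row mirrors the SELECT column order:
    [group_name, phase, primary_ts, secondary_ts, lag_minutes]."""
    return int(row[4]) > max_lag_minutes

def format_alert(row, max_lag_minutes):
    """Compose the Slack text used by the alert-if-lagging step."""
    return (f"Replication Alert: Group '{row[0]}' lag is {row[4]} minutes "
            f"(threshold: {max_lag_minutes}m). Phase: {row[1]}. "
            f"Investigate immediately.")
```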

Checks Snowflake resource monitor thresholds, and when credit limits are approaching, sends a notification to Slack and suspends the warehouse to prevent budget overruns.

naftiko: "0.5"
info:
  label: "Snowflake Resource Monitor Alert"
  description: "Checks Snowflake resource monitor thresholds, and when credit limits are approaching, sends a notification to Slack and suspends the warehouse to prevent budget overruns."
  tags:
    - cost-management
    - platform
    - snowflake
    - slack
capability:
  exposes:
    - type: mcp
      namespace: resource-monitor
      port: 8080
      tools:
        - name: check-resource-monitors
          description: "Check Snowflake resource monitor usage and suspend warehouses approaching their credit limit."
          inputParameters:
            - name: monitor_name
              in: body
              type: string
              description: "Snowflake resource monitor name to check."
            - name: slack_channel
              in: body
              type: string
              description: "Slack channel for resource monitor alerts."
          steps:
            - name: get-monitor-usage
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SHOW RESOURCE MONITORS LIKE '{{monitor_name}}'"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: suspend-warehouse
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "ALTER WAREHOUSE {{monitor_name}}_WH SUSPEND"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: notify-slack
              type: call
              call: "slack.post-message"
              with:
                channel: "{{slack_channel}}"
                text: "Resource Monitor Alert: '{{monitor_name}}' is approaching credit limit. Warehouse suspended to prevent overrun. Current usage: {{get-monitor-usage.data[0][4]}} credits."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: messages
          path: "/chat.postMessage"
          operations:
            - name: post-message
              method: POST
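
`SHOW RESOURCE MONITORS` reports credits used against the monitor's quota; the suspend decision sits between the two Snowflake steps above. A sketch of that logic (the 90% threshold and the `<MONITOR>_WH` warehouse naming convention are assumptions carried over from this capability, not Snowflake defaults):

```python
def approaching_limit(used_credits, credit_quota, threshold_pct=0.9):
    """True when consumption has crossed the alerting threshold."""
    if credit_quota <= 0:
        return False
    return used_credits / credit_quota >= threshold_pct

def suspend_statement(monitor_name):
    # Assumes the warehouse follows the <MONITOR>_WH naming convention
    # used by this capability's suspend-warehouse step.
    return f"ALTER WAREHOUSE {monitor_name}_WH SUSPEND"
```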

Detects recent DDL changes in Snowflake by querying the query history for ALTER and CREATE statements, then posts a summary to a Confluence page for change management documentation.

naftiko: "0.5"
info:
  label: "Snowflake Schema Change Tracker"
  description: "Detects recent DDL changes in Snowflake by querying the query history for ALTER and CREATE statements, then posts a summary to a Confluence page for change management documentation."
  tags:
    - data-governance
    - change-management
    - snowflake
    - confluence
capability:
  exposes:
    - type: mcp
      namespace: schema-changes
      port: 8080
      tools:
        - name: track-schema-changes
          description: "Detect recent DDL changes in Snowflake and document them in Confluence."
          inputParameters:
            - name: database
              in: body
              type: string
              description: "Snowflake database to audit for DDL changes."
            - name: hours_back
              in: body
              type: integer
              description: "Number of hours to look back."
            - name: confluence_page_id
              in: body
              type: string
              description: "Confluence page ID to update with change log."
          steps:
            - name: find-ddl-changes
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT QUERY_TEXT, USER_NAME, START_TIME, QUERY_TYPE FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY WHERE DATABASE_NAME = '{{database}}' AND QUERY_TYPE IN ('CREATE_TABLE', 'ALTER_TABLE', 'DROP_TABLE', 'CREATE_VIEW', 'ALTER_VIEW') AND START_TIME >= DATEADD(hours, -{{hours_back}}, CURRENT_TIMESTAMP()) ORDER BY START_TIME DESC"
                warehouse: "COMPUTE_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: update-confluence
              type: call
              call: "confluence.update-page"
              with:
                page_id: "{{confluence_page_id}}"
                title: "Snowflake Schema Changes - {{database}}"
                body: "DDL changes detected in the last {{hours_back}} hours: {{find-ddl-changes.data.length}} statements found. Most recent: {{find-ddl-changes.data[0][0]}} by {{find-ddl-changes.data[0][1]}} at {{find-ddl-changes.data[0][2]}}."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://{{confluence_domain}}.atlassian.net/wiki/api/v2"
      authentication:
        type: basic
        username: "$secrets.confluence_user"
        password: "$secrets.confluence_api_token"
      resources:
        - name: pages
          path: "/pages/{{page_id}}"
          inputParameters:
            - name: page_id
              in: path
          operations:
            - name: update-page
              method: PUT
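
The Confluence body above summarizes the query-history rows inline. Building that summary from the result set can be sketched as (rows are assumed to mirror the SELECT column order, and the query already sorts by `START_TIME DESC`):

```python
def summarize_ddl_changes(rows, database, hours_back):
    """rows mirror the SELECT order: [query_text, user_name, start_time, query_type]."""
    if not rows:
        return f"No DDL changes detected in {database} in the last {hours_back} hours."
    latest = rows[0]  # query is ordered by START_TIME DESC
    return (f"DDL changes detected in the last {hours_back} hours: "
            f"{len(rows)} statements found. Most recent: {latest[0]} "
            f"by {latest[1]} at {latest[2]}.")
```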

Orchestrates a Snowflake security hardening pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake Security Hardening Pipeline"
  description: "Orchestrates a Snowflake security hardening pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - snowflake
    - datadog
    - github
    - confluence
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-security-hardening-pipeline
          description: "Orchestrates a Snowflake security hardening pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "datadog.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "github.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "confluence.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST

Extracts curated data from Snowflake and writes it to a PostgreSQL operational database for application use. Logs the sync status back to Snowflake.

naftiko: "0.5"
info:
  label: "Snowflake to PostgreSQL Reverse ETL"
  description: "Extracts curated data from Snowflake and writes it to a PostgreSQL operational database for application use. Logs the sync status back to Snowflake."
  tags:
    - data-integration
    - reverse-etl
    - snowflake
    - postgresql
capability:
  exposes:
    - type: mcp
      namespace: reverse-etl
      port: 8080
      tools:
        - name: sync-to-postgres
          description: "Extract data from Snowflake and upsert into a PostgreSQL table."
          inputParameters:
            - name: source_query
              in: body
              type: string
              description: "SQL query to extract data from Snowflake."
            - name: pg_table
              in: body
              type: string
              description: "Target PostgreSQL table name."
            - name: database
              in: body
              type: string
              description: "Snowflake source database."
          steps:
            - name: extract-from-snowflake
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "{{source_query}}"
                warehouse: "ETL_WH"
                database: "{{database}}"
                schema: "PUBLIC"
            - name: upsert-to-postgres
              type: call
              call: "postgres.execute-query"
              with:
                query: "INSERT INTO {{pg_table}} SELECT * FROM staging_data ON CONFLICT DO NOTHING"
            - name: log-sync
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "INSERT INTO {{database}}.PUBLIC.REVERSE_ETL_LOG (SYNC_TIME, TARGET_TABLE, QUERY_ID, STATUS) VALUES (CURRENT_TIMESTAMP(), '{{pg_table}}', '{{extract-from-snowflake.statementHandle}}', 'SUCCESS')"
                warehouse: "ETL_WH"
                database: "{{database}}"
                schema: "PUBLIC"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: postgres
      baseUri: "https://{{pg_rest_host}}/api/v1"
      authentication:
        type: bearer
        token: "$secrets.pg_rest_token"
      resources:
        - name: query
          path: "/query"
          operations:
            - name: execute-query
              method: POST
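
A PostgreSQL `ON CONFLICT ... DO UPDATE` upsert requires an explicit conflict target and a `SET` list, which a generic template statement cannot know. A sketch of generating a valid upsert per table (the `staging_data` source matches the capability; the column and key names are caller-supplied assumptions):

```python
def build_upsert(table, columns, key_columns):
    """Generate a valid PostgreSQL upsert for the given table shape."""
    cols = ", ".join(columns)
    target = ", ".join(key_columns)
    updates = ", ".join(
        f"{c} = EXCLUDED.{c}" for c in columns if c not in key_columns
    )
    if not updates:
        # Every column is part of the key: nothing to update on conflict.
        return (f"INSERT INTO {table} ({cols}) SELECT {cols} FROM staging_data "
                f"ON CONFLICT ({target}) DO NOTHING")
    return (f"INSERT INTO {table} ({cols}) SELECT {cols} FROM staging_data "
            f"ON CONFLICT ({target}) DO UPDATE SET {updates}")
```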

Unloads query results from a Snowflake table to an AWS S3 bucket as Parquet files, then logs the export metadata to a Snowflake audit table.

naftiko: "0.5"
info:
  label: "Snowflake to S3 Data Export Orchestrator"
  description: "Unloads query results from a Snowflake table to an AWS S3 bucket as Parquet files, then logs the export metadata to a Snowflake audit table."
  tags:
    - data-engineering
    - data-export
    - snowflake
    - aws-s3
capability:
  exposes:
    - type: mcp
      namespace: data-export
      port: 8080
      tools:
        - name: export-to-s3
          description: "Unload Snowflake query results to S3 as Parquet and log the export."
          inputParameters:
            - name: source_query
              in: body
              type: string
              description: "SQL query defining the data to export."
            - name: s3_path
              in: body
              type: string
              description: "S3 destination path (e.g., s3://bucket/prefix/)."
            - name: database
              in: body
              type: string
              description: "Snowflake database context."
            - name: schema
              in: body
              type: string
              description: "Snowflake schema context."
          steps:
            - name: unload-data
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "COPY INTO '{{s3_path}}' FROM ({{source_query}}) STORAGE_INTEGRATION = S3_INTEGRATION FILE_FORMAT = (TYPE = PARQUET) OVERWRITE = TRUE HEADER = TRUE"
                warehouse: "ETL_WH"
                database: "{{database}}"
                schema: "{{schema}}"
            - name: log-export
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "INSERT INTO {{database}}.{{schema}}.EXPORT_AUDIT_LOG (EXPORT_TIME, QUERY_ID, S3_PATH, ROW_COUNT) SELECT CURRENT_TIMESTAMP(), '{{unload-data.statementHandle}}', '{{s3_path}}', ROWS_PRODUCED FROM TABLE(RESULT_SCAN('{{unload-data.statementHandle}}'))"
                warehouse: "ETL_WH"
                database: "{{database}}"
                schema: "{{schema}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
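
The unload step composes a `COPY INTO <location>` statement; building it programmatically keeps the storage-integration and file-format clauses consistent across exports. A sketch (the `S3_INTEGRATION` default mirrors this capability's assumption about an existing storage integration):

```python
def build_unload_statement(s3_path, source_query, integration="S3_INTEGRATION"):
    """Compose the COPY INTO statement used by the unload-data step."""
    return (f"COPY INTO '{s3_path}' FROM ({source_query}) "
            f"STORAGE_INTEGRATION = {integration} "
            f"FILE_FORMAT = (TYPE = PARQUET) OVERWRITE = TRUE HEADER = TRUE")
```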

Orchestrates a Snowflake UDF testing pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake UDF Testing Pipeline"
  description: "Orchestrates a Snowflake UDF testing pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - snowflake
    - datadog
    - github
    - confluence
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-udf-testing-pipeline
          description: "Orchestrates a Snowflake UDF testing pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "datadog.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "github.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "confluence.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: github-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: confluence
      baseUri: "https://snowflake.atlassian.net/wiki/rest/api"
      authentication:
        type: bearer
        token: "$secrets.confluence_token"
      resources:
        - name: confluence-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST

Orchestrates a Snowflake usage chargeback pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowflake Usage Chargeback Pipeline"
  description: "Orchestrates a Snowflake usage chargeback pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - snowflake
    - pagerduty
    - servicenow
capability:
  exposes:
    - type: mcp
      namespace: snowflake
      port: 8080
      tools:
        - name: snowflake-usage-chargeback-pipeline
          description: "Orchestrates a Snowflake usage chargeback pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "pagerduty.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "servicenow.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "snowflake.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: pagerduty
      baseUri: "https://api.pagerduty.com"
      authentication:
        type: bearer
        token: "$secrets.pagerduty_token"
      resources:
        - name: pagerduty-resource
          path: "/api/snowflake"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: servicenow
      baseUri: "https://snowflake.service-now.com/api/now"
      authentication:
        type: bearer
        token: "$secrets.servicenow_token"
      resources:
        - name: servicenow-resource
          path: "/api/snowflake"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/snowflake"
          operations:
            - name: execute-3
              method: POST

Orchestrates a Snowpark container deployment pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowpark Container Deployment Pipeline"
  description: "Orchestrates a Snowpark container deployment pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - snowpark
    - snowflake
    - salesforce
    - slack
    - jira
capability:
  exposes:
    - type: mcp
      namespace: snowpark
      port: 8080
      tools:
        - name: snowpark-container-deployment-pipeline
          description: "Orchestrates a Snowpark container deployment pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "salesforce.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "slack.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "jira.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/snowpark"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/snowpark"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/snowpark"
          operations:
            - name: execute-3
              method: POST

Executes a Snowpark Python stored procedure for data transformation, checks the execution result, and posts the outcome to a Microsoft Teams channel.

naftiko: "0.5"
info:
  label: "Snowpark Job Executor and Notifier"
  description: "Executes a Snowpark Python stored procedure for data transformation, checks the execution result, and posts the outcome to a Microsoft Teams channel."
  tags:
    - data-engineering
    - snowpark
    - snowflake
    - microsoft-teams
capability:
  exposes:
    - type: mcp
      namespace: snowpark-execution
      port: 8080
      tools:
        - name: run-snowpark-job
          description: "Execute a Snowpark stored procedure and notify Microsoft Teams with the result."
          inputParameters:
            - name: procedure_name
              in: body
              type: string
              description: "Fully qualified Snowpark stored procedure name."
            - name: proc_args
              in: body
              type: string
              description: "Arguments to pass to the procedure."
            - name: teams_webhook
              in: body
              type: string
              description: "Microsoft Teams webhook URL."
          steps:
            - name: execute-proc
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "CALL {{procedure_name}}({{proc_args}})"
                warehouse: "ETL_WH"
                database: "ANALYTICS"
                schema: "TRANSFORMS"
            - name: get-result
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SELECT * FROM TABLE(RESULT_SCAN('{{execute-proc.statementHandle}}'))"
                warehouse: "ETL_WH"
                database: "ANALYTICS"
                schema: "TRANSFORMS"
            - name: notify-teams
              type: call
              call: "msteams.post-webhook"
              with:
                webhook_url: "{{teams_webhook}}"
                text: "Snowpark Job Complete: {{procedure_name}} finished. Query ID: {{execute-proc.statementHandle}}. Status: {{execute-proc.status}}."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: msteams
      baseUri: "https://outlook.office.com/webhook"
      authentication:
        type: none
      resources:
        - name: webhook
          path: "/{{webhook_url}}"
          inputParameters:
            - name: webhook_url
              in: path
          operations:
            - name: post-webhook
              method: POST
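
The Teams step posts a simple text payload to an incoming webhook. The message body can be sketched as follows (the `{"text": ...}` shape is what classic incoming webhooks accept; note that real Teams webhooks take a full URL rather than a path fragment, so the URL handling here is an assumption):

```python
def build_teams_payload(procedure_name, statement_handle, status):
    """Shape the JSON body an incoming webhook expects."""
    return {
        "text": (f"Snowpark Job Complete: {procedure_name} finished. "
                 f"Query ID: {statement_handle}. Status: {status}.")
    }
```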

Orchestrates a Snowpark ML pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Snowpark ML Pipeline Orchestrator"
  description: "Orchestrates a Snowpark ML pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
  tags:
    - snowpark
    - snowflake
    - slack
    - jira
    - datadog
capability:
  exposes:
    - type: mcp
      namespace: snowpark
      port: 8080
      tools:
        - name: snowpark-ml-pipeline-orchestrator
          description: "Orchestrates a Snowpark ML pipeline across cloud data platform systems, coordinating multiple services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "The primary input identifier."
          steps:
            - name: step-1
              type: call
              call: "slack.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "jira.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "datadog.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/snowpark"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: jira
      baseUri: "https://snowflake.atlassian.net/rest/api/3"
      authentication:
        type: bearer
        token: "$secrets.jira_token"
      resources:
        - name: jira-resource
          path: "/api/snowpark"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: datadog
      baseUri: "https://api.datadoghq.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.datadog_api_key"
      resources:
        - name: datadog-resource
          path: "/api/snowpark"
          operations:
            - name: execute-3
              method: POST

Retrieves the copy history for a Snowpipe, showing recently ingested files, row counts, and any load errors. Used to monitor continuous data ingestion.

naftiko: "0.5"
info:
  label: "Snowpipe Ingestion Status"
  description: "Retrieves the copy history for a Snowpipe, showing recently ingested files, row counts, and any load errors. Used to monitor continuous data ingestion."
  tags:
    - data-engineering
    - ingestion
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-snowpipe
      port: 8080
      tools:
        - name: get-pipe-status
          description: "Fetch the recent copy history for a Snowpipe to check ingestion status."
          inputParameters:
            - name: pipe_name
              in: body
              type: string
              description: "Fully qualified Snowpipe name (database.schema.pipe)."
          call: "snowflake.submit-statement"
          with:
            statement: "SELECT * FROM SNOWFLAKE.ACCOUNT_USAGE.COPY_HISTORY WHERE PIPE_NAME = '{{pipe_name}}' AND LAST_LOAD_TIME >= DATEADD(hours, -24, CURRENT_TIMESTAMP()) ORDER BY LAST_LOAD_TIME DESC"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: load_history
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
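
The `consumes` block above maps `snowflake.submit-statement` onto a POST to the SQL API v2 `/statements` endpoint. As a hedged sketch of the request that call resolves to (the account identifier and JWT are placeholders, and the key-pair token-type header is an assumption about how `$secrets.snowflake_jwt` is meant to be presented):

```python
import json

def build_submit_statement_request(account_identifier, jwt, statement,
                                   warehouse, database, schema):
    """Return (url, headers, body) for POST /api/v2/statements."""
    url = f"https://{account_identifier}.snowflakecomputing.com/api/v2/statements"
    headers = {
        "Authorization": f"Bearer {jwt}",
        # Required by the SQL API when authenticating with a key-pair JWT.
        "X-Snowflake-Authorization-Token-Type": "KEYPAIR_JWT",
        "Content-Type": "application/json",
        "Accept": "application/json",
    }
    body = json.dumps({
        "statement": statement,
        "warehouse": warehouse,
        "database": database,
        "schema": schema,
    })
    return url, headers, body
```

The same shape applies to every capability in this catalog that calls `snowflake.submit-statement`; only the `statement` string and context fields change.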

Retrieves the status of a storage integration from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Storage Integration Status"
  description: "Retrieves the status of a storage integration from the Snowflake cloud data platform."
  tags:
    - storage
    - snowflake
    - status
capability:
  exposes:
    - type: mcp
      namespace: storage
      port: 8080
      tools:
        - name: storage-integration-status
          description: "Retrieves the status of a storage integration from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "Identifier of the storage integration to query."
          call: "snowflake.storage-integration-status"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/storage/integration/status/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: storage-integration-status
              method: GET

Checks the offset lag on a Snowflake stream by querying its metadata. Used by data engineers to detect stale change data capture pipelines.

naftiko: "0.5"
info:
  label: "Stream Lag Monitor"
  description: "Checks the offset lag on a Snowflake stream by querying its metadata. Used by data engineers to detect stale change data capture pipelines."
  tags:
    - data-engineering
    - cdc
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-streams
      port: 8080
      tools:
        - name: check-stream-lag
          description: "Query metadata for a Snowflake stream to check whether its offset has gone stale."
          inputParameters:
            - name: stream_name
              in: body
              type: string
              description: "Fully qualified Snowflake stream name (database.schema.stream)."
          call: "snowflake.submit-statement"
          with:
            statement: "DESCRIBE STREAM {{stream_name}}"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: stream_info
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
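
The `stream_info` output maps `$.data`, which the SQL API returns as rows of strings alongside column metadata. A consumer of this tool might extract the staleness flag like this (a sketch only: the `stale` column name is taken from typical DESCRIBE STREAM output and could vary by Snowflake version):

```python
def stream_is_stale(response: dict) -> bool:
    """Read the `stale` flag out of a SQL API response to DESCRIBE STREAM."""
    # Column names live in resultSetMetaData.rowType; rows are string arrays.
    columns = [c["name"].lower() for c in response["resultSetMetaData"]["rowType"]]
    idx = columns.index("stale")
    row = response["data"][0]          # DESCRIBE STREAM returns one row per stream
    return row[idx].lower() == "true"  # the SQL API encodes values as strings
```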

Retrieves column definitions and data types for a specified Snowflake table. Useful for data discovery and schema validation.

naftiko: "0.5"
info:
  label: "Table Metadata Inspector"
  description: "Retrieves column definitions and data types for a specified Snowflake table. Useful for data discovery and schema validation."
  tags:
    - data-governance
    - metadata
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-metadata
      port: 8080
      tools:
        - name: get-table-metadata
          description: "Retrieve column names, data types, and nullability for a Snowflake table."
          inputParameters:
            - name: database
              in: body
              type: string
              description: "The Snowflake database containing the table."
            - name: schema
              in: body
              type: string
              description: "The schema containing the table."
            - name: table_name
              in: body
              type: string
              description: "The table name to inspect."
          call: "snowflake.submit-statement"
          with:
            statement: "DESCRIBE TABLE {{database}}.{{schema}}.{{table_name}}"
            warehouse: "COMPUTE_WH"
            database: "{{database}}"
            schema: "{{schema}}"
          outputParameters:
            - name: columns
              type: array
              mapping: "$.data"
            - name: query_id
              type: string
              mapping: "$.statementHandle"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
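
Because the `columns` output parameter maps `$.data`, a consumer receives positional arrays rather than named fields. A small helper can rejoin rows with the column names carried in `resultSetMetaData.rowType` (field names here follow the SQL API response format):

```python
def rows_to_dicts(response: dict) -> list[dict]:
    """Zip SQL API result rows with their column names."""
    names = [c["name"] for c in response["resultSetMetaData"]["rowType"]]
    return [dict(zip(names, row)) for row in response["data"]]
```

For DESCRIBE TABLE output, each resulting dict would carry keys such as `name` and `type`, one dict per column of the inspected table.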

Queries Snowflake tag references to report which columns have been classified with sensitivity tags. Used by governance teams to verify data classification coverage.

naftiko: "0.5"
info:
  label: "Tag-Based Data Classification Reporter"
  description: "Queries Snowflake tag references to report which columns have been classified with sensitivity tags. Used by governance teams to verify data classification coverage."
  tags:
    - data-governance
    - classification
    - compliance
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-classification
      port: 8080
      tools:
        - name: get-classification-report
          description: "List all columns tagged with a specific sensitivity classification tag."
          inputParameters:
            - name: tag_name
              in: body
              type: string
              description: "The fully qualified tag name (e.g., GOVERNANCE.TAGS.PII)."
          call: "snowflake.submit-statement"
          with:
            statement: "SELECT * FROM TABLE(SNOWFLAKE.ACCOUNT_USAGE.TAG_REFERENCES_WITH_LINEAGE('{{tag_name}}')) WHERE DOMAIN = 'COLUMN' ORDER BY OBJECT_DATABASE, OBJECT_SCHEMA, OBJECT_NAME"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: tagged_columns
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
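
To turn the raw `tagged_columns` rowset into the coverage view governance teams actually want, a consumer could aggregate tagged columns per table. This is a hypothetical post-processing step, assuming the rows have already been mapped to dicts keyed by the ACCOUNT_USAGE column names (`OBJECT_DATABASE`, `OBJECT_SCHEMA`, `OBJECT_NAME`):

```python
from collections import Counter

def coverage_by_table(tagged_columns: list[dict]) -> Counter:
    """Count tagged columns per fully qualified table name."""
    return Counter(
        f"{r['OBJECT_DATABASE']}.{r['OBJECT_SCHEMA']}.{r['OBJECT_NAME']}"
        for r in tagged_columns
    )
```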

Checks the execution history and status of a Snowflake task by name. Returns last run time, state, and error messages if any.

naftiko: "0.5"
info:
  label: "Task Execution Status Checker"
  description: "Checks the execution history and status of a Snowflake task by name. Returns last run time, state, and error messages if any."
  tags:
    - data-engineering
    - scheduling
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-tasks
      port: 8080
      tools:
        - name: get-task-status
          description: "Retrieve the execution history for a Snowflake task."
          inputParameters:
            - name: task_name
              in: body
              type: string
              description: "Name of the Snowflake task to inspect."
          call: "snowflake.submit-statement"
          with:
            statement: "SELECT * FROM TABLE(INFORMATION_SCHEMA.TASK_HISTORY(TASK_NAME => '{{task_name}}')) ORDER BY SCHEDULED_TIME DESC LIMIT 10"
            warehouse: "COMPUTE_WH"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: history
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
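
Since the TASK_HISTORY query above already orders runs by SCHEDULED_TIME DESC, surfacing the most recent failure is a linear scan. A sketch, assuming the `history` rows have been mapped to dicts with the TASK_HISTORY column names (`STATE`, `ERROR_MESSAGE`):

```python
def latest_failure(history: list[dict]):
    """Return the most recent failed run, or None if all runs succeeded.

    Assumes rows are already ordered by SCHEDULED_TIME DESC, as in the
    capability's query, so the first FAILED row is the latest failure.
    """
    for run in history:
        if run.get("STATE") == "FAILED":
            return run
    return None
```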

Reads Snowflake warehouse and database configurations, generates a Terraform state comparison, and opens a GitHub pull request when drift is detected.

naftiko: "0.5"
info:
  label: "Terraform Snowflake Resource Sync"
  description: "Reads Snowflake warehouse and database configurations, generates a Terraform state comparison, and opens a GitHub pull request when drift is detected."
  tags:
    - platform
    - infrastructure
    - snowflake
    - terraform
    - github
capability:
  exposes:
    - type: mcp
      namespace: infra-drift
      port: 8080
      tools:
        - name: detect-snowflake-drift
          description: "Compare live Snowflake resource configuration to Terraform state and create a GitHub PR for drift remediation."
          inputParameters:
            - name: warehouse_name
              in: body
              type: string
              description: "Snowflake warehouse to inspect."
            - name: repo_owner
              in: body
              type: string
              description: "GitHub repository owner."
            - name: repo_name
              in: body
              type: string
              description: "GitHub repository name containing Terraform configs."
            - name: branch_name
              in: body
              type: string
              description: "Branch name for the drift-fix PR."
          steps:
            - name: get-warehouse-config
              type: call
              call: "snowflake.submit-statement"
              with:
                statement: "SHOW WAREHOUSES LIKE '{{warehouse_name}}'"
                warehouse: "ADMIN_WH"
                database: "SNOWFLAKE"
                schema: "ACCOUNT_USAGE"
            - name: get-tf-file
              type: call
              call: "github.get-content"
              with:
                owner: "{{repo_owner}}"
                repo: "{{repo_name}}"
                path: "snowflake/warehouses/{{warehouse_name}}.tf"
            - name: create-pr
              type: call
              call: "github.create-pull-request"
              with:
                owner: "{{repo_owner}}"
                repo: "{{repo_name}}"
                head: "{{branch_name}}"
                base: "main"
                title: "Drift detected: Snowflake warehouse {{warehouse_name}}"
                body: "Live config differs from Terraform. Current warehouse size: {{get-warehouse-config.data[0][3]}}. Review and reconcile."
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
    - type: http
      namespace: github
      baseUri: "https://api.github.com/repos"
      authentication:
        type: bearer
        token: "$secrets.github_token"
      resources:
        - name: contents
          path: "/{{owner}}/{{repo}}/contents/{{path}}"
          inputParameters:
            - name: owner
              in: path
            - name: repo
              in: path
            - name: path
              in: path
          operations:
            - name: get-content
              method: GET
        - name: pulls
          path: "/{{owner}}/{{repo}}/pulls"
          inputParameters:
            - name: owner
              in: path
            - name: repo
              in: path
          operations:
            - name: create-pull-request
              method: POST
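
The create-pr step reads the live warehouse size from index 3 of the SHOW WAREHOUSES row. A drift check between that value and the Terraform file fetched in get-tf-file might look like the following naive sketch; the `warehouse_size = "..."` regex is an assumption based on the common attribute spelling in the Snowflake Terraform provider, not a real HCL parser:

```python
import re

def detect_size_drift(show_row: list, tf_source: str):
    """Compare the live size (SHOW WAREHOUSES column 3) to the declared one."""
    live_size = show_row[3]
    m = re.search(r'warehouse_size\s*=\s*"([^"]+)"', tf_source)
    declared = m.group(1) if m else None
    # Case-insensitive compare; Snowflake reports sizes like "LARGE".
    return live_size.upper() != (declared or "").upper(), live_size, declared
```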

Retrieves the auto-suspend configuration for a virtual warehouse from the Snowflake cloud data platform.

naftiko: "0.5"
info:
  label: "Warehouse Auto Suspend Config"
  description: "Retrieves the auto-suspend configuration for a virtual warehouse from the Snowflake cloud data platform."
  tags:
    - warehouse
    - snowflake
    - config
capability:
  exposes:
    - type: mcp
      namespace: warehouse
      port: 8080
      tools:
        - name: warehouse-auto-suspend-config
          description: "Retrieves the auto-suspend configuration for a virtual warehouse from the Snowflake cloud data platform."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "Identifier of the warehouse to query."
          call: "snowflake.warehouse-auto-suspend-config"
          with:
            input_id: "{{input_id}}"
          outputParameters:
            - name: result
              type: string
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: resource
          path: "/warehouse/auto/suspend/config/{{input_id}}"
          inputParameters:
            - name: input_id
              in: path
          operations:
            - name: warehouse-auto-suspend-config
              method: GET

Orchestrates a warehouse cost optimization pipeline across Snowflake, Salesforce, and Slack, coordinating the three services and notifying stakeholders.

naftiko: "0.5"
info:
  label: "Warehouse Cost Optimization Pipeline"
  description: "Orchestrates a warehouse cost optimization pipeline across Snowflake, Salesforce, and Slack, coordinating the three services and notifying stakeholders."
  tags:
    - warehouse
    - snowflake
    - salesforce
    - slack
capability:
  exposes:
    - type: mcp
      namespace: warehouse
      port: 8080
      tools:
        - name: warehouse-cost-optimization-pipeline
          description: "Orchestrates a warehouse cost optimization pipeline across Snowflake, Salesforce, and Slack, coordinating the three services and notifying stakeholders."
          inputParameters:
            - name: input_id
              in: body
              type: string
              description: "Identifier of the warehouse to optimize."
          steps:
            - name: step-1
              type: call
              call: "snowflake.execute-1"
              with:
                input: "{{input_id}}"
            - name: step-2
              type: call
              call: "salesforce.execute-2"
              with:
                input: "{{input_id}}"
            - name: step-3
              type: call
              call: "slack.execute-3"
              with:
                input: "{{input_id}}"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://account.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_token"
      resources:
        - name: snowflake-resource
          path: "/api/warehouse"
          operations:
            - name: execute-1
              method: POST
    - type: http
      namespace: salesforce
      baseUri: "https://snowflake.my.salesforce.com/services/data/v58.0"
      authentication:
        type: bearer
        token: "$secrets.salesforce_access_token"
      resources:
        - name: salesforce-resource
          path: "/api/warehouse"
          operations:
            - name: execute-2
              method: POST
    - type: http
      namespace: slack
      baseUri: "https://slack.com/api"
      authentication:
        type: bearer
        token: "$secrets.slack_bot_token"
      resources:
        - name: slack-resource
          path: "/api/warehouse"
          operations:
            - name: execute-3
              method: POST

Executes a SQL statement against a specified Snowflake warehouse and returns the query results. Used by analysts and data engineers for ad-hoc queries without needing a local client.

naftiko: "0.5"
info:
  label: "Warehouse Query Executor"
  description: "Executes a SQL statement against a specified Snowflake warehouse and returns the query results. Used by analysts and data engineers for ad-hoc queries without needing a local client."
  tags:
    - data-warehousing
    - sql
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-query
      port: 8080
      tools:
        - name: execute-query
          description: "Submit a SQL statement to a Snowflake warehouse and return the result set."
          inputParameters:
            - name: warehouse
              in: body
              type: string
              description: "The Snowflake virtual warehouse name to use for compute."
            - name: database
              in: body
              type: string
              description: "The Snowflake database to query against."
            - name: schema
              in: body
              type: string
              description: "The schema within the database."
            - name: statement
              in: body
              type: string
              description: "The SQL statement to execute."
          call: "snowflake.submit-statement"
          with:
            warehouse: "{{warehouse}}"
            database: "{{database}}"
            schema: "{{schema}}"
            statement: "{{statement}}"
          outputParameters:
            - name: query_id
              type: string
              mapping: "$.statementHandle"
            - name: rows
              type: array
              mapping: "$.data"
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
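
One detail the output mapping glosses over: long-running statements don't return rows synchronously. The SQL API answers 202 with a `statementHandle` while the query runs, and the caller polls `GET /statements/{handle}` until it gets 200. A sketch of that loop, with `fetch` as a stand-in for the HTTP call so the control flow can be shown without a network connection:

```python
import time

def wait_for_results(fetch, handle, interval=1.0, max_polls=60):
    """Poll GET /api/v2/statements/{handle} until the statement finishes."""
    for _ in range(max_polls):
        status, body = fetch(f"/api/v2/statements/{handle}")
        if status == 200:       # statement finished; body carries data and status
            return body
        if status != 202:       # anything other than "still running" is an error
            raise RuntimeError(f"statement failed with HTTP {status}")
        time.sleep(interval)
    raise TimeoutError(f"statement {handle} still running after {max_polls} polls")
```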

Resizes a Snowflake virtual warehouse to a specified size. Used by platform teams to scale compute up or down based on workload demand.

naftiko: "0.5"
info:
  label: "Warehouse Scaling Manager"
  description: "Resizes a Snowflake virtual warehouse to a specified size. Used by platform teams to scale compute up or down based on workload demand."
  tags:
    - platform
    - compute
    - snowflake
capability:
  exposes:
    - type: mcp
      namespace: snowflake-warehouse
      port: 8080
      tools:
        - name: resize-warehouse
          description: "Alter a Snowflake warehouse to a new size (e.g., XSMALL, SMALL, MEDIUM, LARGE)."
          inputParameters:
            - name: warehouse_name
              in: body
              type: string
              description: "The name of the virtual warehouse to resize."
            - name: new_size
              in: body
              type: string
              description: "Target warehouse size (XSMALL, SMALL, MEDIUM, LARGE, XLARGE)."
          call: "snowflake.submit-statement"
          with:
            statement: "ALTER WAREHOUSE {{warehouse_name}} SET WAREHOUSE_SIZE = '{{new_size}}'"
            warehouse: "{{warehouse_name}}"
            database: "SNOWFLAKE"
            schema: "ACCOUNT_USAGE"
          outputParameters:
            - name: status
              type: string
              mapping: "$.status"
  consumes:
    - type: http
      namespace: snowflake
      baseUri: "https://{{account_identifier}}.snowflakecomputing.com/api/v2"
      authentication:
        type: bearer
        token: "$secrets.snowflake_jwt"
      resources:
        - name: statements
          path: "/statements"
          operations:
            - name: submit-statement
              method: POST
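
The ALTER statement above is built by plain string templating, so untrusted input could smuggle extra SQL into it. A cautious caller would validate both parameters first; this wrapper is a sketch, and the allowed-size set is an assumption based on the standard Snowflake warehouse sizes named in this capability:

```python
import re

ALLOWED_SIZES = {"XSMALL", "SMALL", "MEDIUM", "LARGE", "XLARGE"}
# Unquoted Snowflake identifiers: letter or underscore, then letters,
# digits, underscores, or dollar signs.
IDENTIFIER = re.compile(r"^[A-Za-z_][A-Za-z0-9_$]*$")

def build_resize_statement(warehouse_name: str, new_size: str) -> str:
    """Build the ALTER WAREHOUSE statement after validating both inputs."""
    if not IDENTIFIER.match(warehouse_name):
        raise ValueError(f"unsafe warehouse name: {warehouse_name!r}")
    size = new_size.upper().replace("-", "")   # accept both "X-Small" and "XSMALL"
    if size not in ALLOWED_SIZES:
        raise ValueError(f"unsupported size: {new_size!r}")
    return f"ALTER WAREHOUSE {warehouse_name} SET WAREHOUSE_SIZE = '{size}'"
```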