BHP Capabilities
Naftiko 0.5 capability definitions for BHP: 100 capabilities covering integration workflows and service orchestrations.
---
naftiko: "0.5"
info:
label: "Amazon S3 Geological Data Retrieval"
description: "Retrieves metadata for an S3 object containing geological survey data."
tags:
- cloud-storage
- amazon-s3
- geology
capability:
exposes:
- type: mcp
namespace: s3-storage
port: 8080
tools:
- name: get-s3-object-info
description: "Look up an S3 object by bucket and key."
inputParameters:
- name: bucket
in: body
type: string
description: "The S3 bucket name."
- name: key
in: body
type: string
description: "The object key."
call: "s3.head-object"
with:
bucket: "{{bucket}}"
key: "{{key}}"
outputParameters:
- name: content_type
type: string
mapping: "$.ContentType"
- name: content_length
type: integer
mapping: "$.ContentLength"
- name: last_modified
type: string
mapping: "$.LastModified"
consumes:
- type: http
namespace: s3
baseUri: "https://{{bucket}}.s3.amazonaws.com"
authentication:
type: aws-sigv4
accessKeyId: "$secrets.aws_access_key"
secretAccessKey: "$secrets.aws_secret_key"
resources:
- name: objects
path: "/{{key}}"
inputParameters:
- name: bucket
in: path
- name: key
in: path
operations:
- name: head-object
method: HEAD
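When `get-s3-object-info` runs, the `{{bucket}}` and `{{key}}` placeholders are substituted into the consumed binding's `baseUri` and resource `path` to form the request URL. A minimal sketch of that substitution in Python (the `expand` helper is illustrative, not part of Naftiko):

```python
import re

def expand(template: str, params: dict) -> str:
    """Replace each {{name}} placeholder with its value from params."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: str(params[m.group(1)]), template)

params = {"bucket": "bhp-geology", "key": "surveys/pilbara-2024.tif"}
base_uri = expand("https://{{bucket}}.s3.amazonaws.com", params)
path = expand("/{{key}}", params)
url = base_uri + path
print(url)  # https://bhp-geology.s3.amazonaws.com/surveys/pilbara-2024.tif
```

The resolved URL is then issued as a `HEAD` request, with the `outputParameters` mappings applied to the response headers.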
---
naftiko: "0.5"
info:
label: "Autonomous Vehicle Telemetry Pipeline"
description: "Ingests autonomous haul truck telemetry from Datadog, detects navigation anomalies via Azure ML, creates incident reports in ServiceNow, and alerts the autonomous systems team via Slack."
tags:
- autonomous-vehicles
- telemetry
- datadog
- azure-machine-learning
- servicenow
- slack
capability:
exposes:
- type: mcp
namespace: av-telemetry
port: 8080
tools:
- name: monitor-autonomous-fleet
description: "Monitor AV telemetry, detect anomalies, report incidents, and alert team."
inputParameters:
- name: fleet_id
in: body
type: string
description: "The autonomous fleet identifier."
- name: slack_channel
in: body
type: string
description: "Slack channel for AV team."
steps:
- name: get-telemetry
type: call
call: "datadog.query-metrics"
with:
query: "avg:av.navigation.deviation{fleet:{{fleet_id}}} by {truck}"
from: "-2h"
- name: detect-anomalies
type: call
call: "azureml.score"
with:
model_type: "av_navigation_anomaly"
data: "{{get-telemetry.series}}"
- name: create-report
type: call
call: "servicenow.create-incident"
with:
short_description: "AV anomaly: fleet {{fleet_id}}"
description: "Anomalies detected: {{detect-anomalies.anomaly_count}}. Affected trucks: {{detect-anomalies.affected_trucks}}. Max deviation: {{detect-anomalies.max_deviation_m}}m."
category: "autonomous_systems"
- name: alert-team
type: call
call: "slack.post-message"
with:
channel: "{{slack_channel}}"
text: "AV Fleet {{fleet_id}}: {{detect-anomalies.anomaly_count}} navigation anomalies. Max deviation: {{detect-anomalies.max_deviation_m}}m. Incident: {{create-report.number}}"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: slack
baseUri: "https://slack.com/api"
authentication:
type: bearer
token: "$secrets.slack_bot_token"
resources:
- name: messages
path: "/chat.postMessage"
operations:
- name: post-message
method: POST
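Multi-step tools like `monitor-autonomous-fleet` chain calls by referencing earlier outputs with `{{step-name.field}}`. A small sketch of how a runner might thread results through a shared context (the `render`/`run` helpers and the stub backends standing in for Datadog and Azure ML are illustrative):

```python
import re

def render(template: str, context: dict) -> str:
    """Resolve {{step.field}} and {{param}} references against the context."""
    def sub(m):
        ref = m.group(1)
        if "." in ref:
            step, field = ref.split(".", 1)
            return str(context[step][field])
        return str(context[ref])
    return re.sub(r"\{\{([\w.-]+)\}\}", sub, template)

# Stub backends standing in for the consumed services.
backends = {
    "datadog.query-metrics": lambda args: {"series": "[...]"},
    "azureml.score": lambda args: {"anomaly_count": 3, "max_deviation_m": 1.8},
}

def run(steps, inputs):
    context = dict(inputs)
    for step in steps:
        args = {k: render(v, context) for k, v in step["with"].items()}
        context[step["name"]] = backends[step["call"]](args)
    return context

ctx = run(
    [
        {"name": "get-telemetry", "call": "datadog.query-metrics",
         "with": {"query": "avg:av.navigation.deviation{fleet:{{fleet_id}}} by {truck}"}},
        {"name": "detect-anomalies", "call": "azureml.score",
         "with": {"data": "{{get-telemetry.series}}"}},
    ],
    {"fleet_id": "WAIO-07"},
)
print(ctx["detect-anomalies"]["anomaly_count"])  # 3
```

Note that only double-brace placeholders are substituted; the single braces in the Datadog query syntax pass through untouched.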
---
naftiko: "0.5"
info:
label: "AWS Lambda Function Invocation"
description: "Invokes an AWS Lambda function for serverless mining data processing and automation tasks."
tags:
- serverless
- aws-lambda
capability:
exposes:
- type: mcp
namespace: aws-lambda
port: 8080
tools:
- name: invoke-function
description: "Invoke an AWS Lambda function by name."
inputParameters:
- name: function_name
in: body
type: string
description: "The Lambda function name or ARN."
- name: payload
in: body
type: string
description: "JSON payload to pass to the function."
call: "lambda.invoke"
with:
function_name: "{{function_name}}"
payload: "{{payload}}"
outputParameters:
- name: status_code
type: integer
mapping: "$.StatusCode"
- name: response_payload
type: string
mapping: "$.Payload"
consumes:
- type: http
namespace: lambda
baseUri: "https://lambda.ap-southeast-2.amazonaws.com/2015-03-31"
authentication:
type: aws-sigv4
accessKeyId: "$secrets.aws_access_key"
secretAccessKey: "$secrets.aws_secret_key"
resources:
- name: functions
path: "/functions/{{function_name}}/invocations"
inputParameters:
- name: function_name
in: path
operations:
- name: invoke
method: POST
---
naftiko: "0.5"
info:
label: "Azure Blob Storage Upload"
description: "Uploads geological and operational data files to Azure Blob Storage for archival and cross-team sharing."
tags:
- cloud-storage
- azure-blob-storage
capability:
exposes:
- type: mcp
namespace: azure-blob
port: 8080
tools:
- name: upload-blob
description: "Upload a file to an Azure Blob Storage container."
inputParameters:
- name: container
in: body
type: string
description: "The Azure Blob container name."
- name: blob_name
in: body
type: string
description: "The destination blob name."
call: "azureblob.put-blob"
with:
container: "{{container}}"
blob_name: "{{blob_name}}"
outputParameters:
- name: url
type: string
mapping: "$.url"
- name: etag
type: string
mapping: "$.etag"
consumes:
- type: http
namespace: azureblob
baseUri: "https://bhp.blob.core.windows.net"
authentication:
type: apiKey
name: "x-ms-access-key"
in: header
value: "$secrets.azure_storage_key"
resources:
- name: blobs
path: "/{{container}}/{{blob_name}}"
inputParameters:
- name: container
in: path
- name: blob_name
in: path
operations:
- name: put-blob
method: PUT
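The `authentication` blocks in this file use four schemes: `aws-sigv4`, `apiKey` (a named header), `bearer`, and `basic`. A sketch of how the last three map onto request headers (SigV4 requires a full canonical-request signing step and is omitted; the `auth_headers` helper is illustrative, not part of Naftiko):

```python
import base64

def auth_headers(auth: dict) -> dict:
    """Build HTTP headers for the simple authentication schemes."""
    kind = auth["type"]
    if kind == "bearer":
        return {"Authorization": f"Bearer {auth['token']}"}
    if kind == "basic":
        cred = f"{auth['username']}:{auth['password']}".encode()
        return {"Authorization": "Basic " + base64.b64encode(cred).decode()}
    if kind == "apiKey" and auth.get("in") == "header":
        return {auth["name"]: auth["value"]}
    raise ValueError(f"unsupported scheme: {kind}")

print(auth_headers({"type": "apiKey", "in": "header",
                    "name": "x-ms-access-key", "value": "<storage-key>"}))
```

In each case the `$secrets.*` reference would be resolved to the actual credential before the header is built.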
---
naftiko: "0.5"
info:
label: "Azure DevOps Build Status Lookup"
description: "Retrieves the latest build status for a given Azure DevOps pipeline used in mine operations software."
tags:
- ci
- devops
- azure-devops
capability:
exposes:
- type: mcp
namespace: devops-builds
port: 8080
tools:
- name: get-build-status
description: "Look up the latest Azure DevOps build for a pipeline."
inputParameters:
- name: pipeline_id
in: body
type: string
description: "The Azure DevOps pipeline definition ID."
call: "azuredevops.get-latest-build"
with:
definition_id: "{{pipeline_id}}"
outputParameters:
- name: build_number
type: string
mapping: "$.value[0].buildNumber"
- name: result
type: string
mapping: "$.value[0].result"
- name: start_time
type: string
mapping: "$.value[0].startTime"
consumes:
- type: http
namespace: azuredevops
baseUri: "https://dev.azure.com/bhp/_apis/build"
authentication:
type: bearer
token: "$secrets.azuredevops_pat"
inputParameters:
- name: api-version
in: query
value: "7.0"
resources:
- name: builds
path: "/builds?definitions={{definition_id}}&$top=1&statusFilter=completed"
inputParameters:
- name: definition_id
in: query
operations:
- name: get-latest-build
method: GET
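The `mapping` expressions used by `outputParameters` are simple JSONPath selectors such as `$.value[0].buildNumber`. A minimal resolver covering just the dotted-field and `[index]` forms that appear in this file (a production runner would use a full JSONPath library; the `resolve` helper is illustrative):

```python
import re

def resolve(mapping: str, document):
    """Walk a response document along a '$.a[0].b'-style selector."""
    node = document
    for field, index in re.findall(r"\.([A-Za-z_]\w*)(?:\[(\d+)\])?", mapping):
        node = node[field]
        if index:
            node = node[int(index)]
    return node

response = {"value": [{"buildNumber": "20240518.3", "result": "succeeded"}]}
print(resolve("$.value[0].buildNumber", response))  # 20240518.3
```

The same resolver handles flat selectors like `$.ContentType` from the S3 capability above.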
---
naftiko: "0.5"
info:
label: "Azure Machine Learning Model Scoring"
description: "Invokes an Azure ML scoring endpoint for equipment health prediction, returning the predicted failure probability and recommended action."
tags:
- machine-learning
- azure-machine-learning
- prediction
- mining
capability:
exposes:
- type: mcp
namespace: ml-scoring
port: 8080
tools:
- name: score-equipment-health
description: "Score equipment health using the Azure ML prediction endpoint."
inputParameters:
- name: equipment_id
in: body
type: string
description: "The equipment identifier."
- name: telemetry_data
in: body
type: string
description: "JSON string of recent telemetry readings."
call: "azureml.score"
with:
equipment_id: "{{equipment_id}}"
data: "{{telemetry_data}}"
outputParameters:
- name: failure_probability
type: string
mapping: "$.predictions[0].failure_probability"
- name: recommended_action
type: string
mapping: "$.predictions[0].recommended_action"
consumes:
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
---
naftiko: "0.5"
info:
label: "Biodiversity Offset Tracking Pipeline"
description: "Retrieves biodiversity survey data from S3, analyzes offset compliance in Snowflake, updates obligation records in ServiceNow, and publishes reports to SharePoint."
tags:
- biodiversity
- environmental-offset
- amazon-s3
- snowflake
- servicenow
- sharepoint
capability:
exposes:
- type: mcp
namespace: biodiversity-offset
port: 8080
tools:
- name: track-biodiversity-offsets
description: "Analyze survey data, check compliance, update records, and publish."
inputParameters:
- name: offset_area_id
in: body
type: string
description: "The biodiversity offset area ID."
- name: bucket
in: body
type: string
description: "S3 bucket with survey data."
- name: survey_key
in: body
type: string
description: "S3 key for survey file."
steps:
- name: get-survey
type: call
call: "s3.get-object"
with:
bucket: "{{bucket}}"
key: "{{survey_key}}"
- name: analyze-compliance
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL ANALYZE_BIODIVERSITY_COMPLIANCE('{{offset_area_id}}')"
warehouse: "ENV_WH"
- name: update-obligations
type: call
call: "servicenow.update-record"
with:
table: "u_biodiversity_obligation"
offset_area_id: "{{offset_area_id}}"
compliance_status: "{{analyze-compliance.status}}"
- name: publish-report
type: call
call: "sharepoint.upload-file"
with:
site_id: "env_compliance_site"
folder_path: "Biodiversity/{{offset_area_id}}"
file_name: "biodiversity_report_{{offset_area_id}}.pdf"
consumes:
- type: http
namespace: s3
baseUri: "https://{{bucket}}.s3.amazonaws.com"
authentication:
type: aws-sigv4
accessKeyId: "$secrets.aws_access_key"
secretAccessKey: "$secrets.aws_secret_key"
resources:
- name: objects
path: "/{{key}}"
inputParameters:
- name: bucket
in: path
- name: key
in: path
operations:
- name: get-object
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: update-record
method: PATCH
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
---
naftiko: "0.5"
info:
label: "Bitbucket Code Repository Lookup"
description: "Retrieves the latest pull request from a Bitbucket repository for mining software code reviews."
tags:
- development
- bitbucket
- code-review
capability:
exposes:
- type: mcp
namespace: code-review
port: 8080
tools:
- name: get-latest-pr
description: "Look up the latest pull request for a Bitbucket repo."
inputParameters:
- name: workspace
in: body
type: string
description: "The Bitbucket workspace slug."
- name: repo_slug
in: body
type: string
description: "The repository slug."
call: "bitbucket.get-latest-pr"
with:
workspace: "{{workspace}}"
repo_slug: "{{repo_slug}}"
consumes:
- type: http
namespace: bitbucket
baseUri: "https://api.bitbucket.org/2.0"
authentication:
type: bearer
token: "$secrets.bitbucket_token"
resources:
- name: pull-requests
path: "/repositories/{{workspace}}/{{repo_slug}}/pullrequests?sort=-created_on&pagelen=1"
inputParameters:
- name: workspace
in: path
- name: repo_slug
in: path
operations:
- name: get-latest-pr
method: GET
---
naftiko: "0.5"
info:
label: "Blast Pattern Design and Approval Pipeline"
description: "Retrieves geological data from Snowflake, generates blast pattern recommendations via Azure ML, creates approval requests in ServiceNow, and notifies the drill and blast team via Microsoft Teams."
tags:
- blasting
- drill-and-blast
- snowflake
- azure-machine-learning
- servicenow
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: blast-pattern
port: 8080
tools:
- name: design-blast-pattern
description: "Generate blast pattern from geology, request approval, and notify team."
inputParameters:
- name: bench_id
in: body
type: string
description: "The mine bench identifier."
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: blast_channel
in: body
type: string
description: "Microsoft Teams drill and blast channel."
steps:
- name: get-geology
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM BENCH_GEOLOGY WHERE bench_id = '{{bench_id}}' AND mine_site = '{{mine_site}}'"
warehouse: "GEOLOGY_WH"
- name: generate-pattern
type: call
call: "azureml.score"
with:
model_type: "blast_pattern"
geology_data: "{{get-geology.results}}"
- name: request-approval
type: call
call: "servicenow.create-record"
with:
table: "u_blast_approval"
bench_id: "{{bench_id}}"
pattern: "{{generate-pattern.pattern_id}}"
burden: "{{generate-pattern.burden_m}}"
spacing: "{{generate-pattern.spacing_m}}"
charge_weight: "{{generate-pattern.charge_kg}}"
- name: notify-team
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{blast_channel}}"
text: "Blast pattern designed for bench {{bench_id}}. Burden: {{generate-pattern.burden_m}}m, Spacing: {{generate-pattern.spacing_m}}m, Charge: {{generate-pattern.charge_kg}}kg. Approval: {{request-approval.sys_id}}"
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: create-record
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
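Interpolating `{{bench_id}}` directly into the SQL text works but invites injection if the input is not controlled. The Snowflake SQL API v2 `/statements` endpoint accepts positional bindings as an alternative; a sketch of the request body a hardened `get-geology` step could send (the `statement_request` helper is illustrative, not part of Naftiko):

```python
def statement_request(bench_id: str, mine_site: str) -> dict:
    """Build a Snowflake SQL API v2 body with positional bindings."""
    return {
        "statement": "SELECT * FROM BENCH_GEOLOGY WHERE bench_id = ? AND mine_site = ?",
        "warehouse": "GEOLOGY_WH",
        "bindings": {
            "1": {"type": "TEXT", "value": bench_id},
            "2": {"type": "TEXT", "value": mine_site},
        },
    }

body = statement_request("B-1042", "WAIO")
print(body["bindings"]["1"]["value"])  # B-1042
```

The same pattern applies to the other Snowflake-backed pipelines in this file that splice inputs into `statement` strings.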
---
naftiko: "0.5"
info:
label: "Bloomberg Commodity Price Lookup"
description: "Retrieves the latest commodity price from Bloomberg Enterprise Data for iron ore, copper, or other mining commodities."
tags:
- commodities
- bloomberg
- trading
- mining
capability:
exposes:
- type: mcp
namespace: commodity-prices
port: 8080
tools:
- name: get-commodity-price
description: "Look up the latest Bloomberg commodity price by ticker."
inputParameters:
- name: ticker
in: body
type: string
description: "The Bloomberg commodity ticker (e.g., IODBZ00 for iron ore)."
call: "bloomberg.get-price"
with:
ticker: "{{ticker}}"
outputParameters:
- name: last_price
type: string
mapping: "$.data[0].lastPrice"
- name: currency
type: string
mapping: "$.data[0].currency"
- name: timestamp
type: string
mapping: "$.data[0].lastUpdateTime"
consumes:
- type: http
namespace: bloomberg
baseUri: "https://api.bloomberg.com/eap/catalogs/bbg/datasets"
authentication:
type: bearer
token: "$secrets.bloomberg_token"
resources:
- name: prices
path: "/prices?tickers={{ticker}}"
inputParameters:
- name: ticker
in: query
operations:
- name: get-price
method: GET
---
naftiko: "0.5"
info:
label: "Box Document Retrieval"
description: "Retrieves file metadata from Box by file ID for mining documentation access."
tags:
- collaboration
- box
- documents
capability:
exposes:
- type: mcp
namespace: cloud-storage
port: 8080
tools:
- name: get-box-file
description: "Look up a Box file by ID."
inputParameters:
- name: file_id
in: body
type: string
description: "The Box file ID."
call: "box.get-file"
with:
file_id: "{{file_id}}"
outputParameters:
- name: name
type: string
mapping: "$.name"
- name: size
type: integer
mapping: "$.size"
- name: owner
type: string
mapping: "$.owned_by.name"
consumes:
- type: http
namespace: box
baseUri: "https://api.box.com/2.0"
authentication:
type: bearer
token: "$secrets.box_token"
resources:
- name: files
path: "/files/{{file_id}}"
inputParameters:
- name: file_id
in: path
operations:
- name: get-file
method: GET
---
naftiko: "0.5"
info:
label: "Cisco Network Device Status"
description: "Retrieves the operational status of a Cisco network device at a mine site, returning hostname, uptime, and interface status."
tags:
- networking
- cisco
- infrastructure
capability:
exposes:
- type: mcp
namespace: network-ops
port: 8080
tools:
- name: get-device-status
description: "Look up Cisco device status by device ID."
inputParameters:
- name: device_id
in: body
type: string
description: "The Cisco DNA Center device ID."
call: "cisco.get-device"
with:
device_id: "{{device_id}}"
outputParameters:
- name: hostname
type: string
mapping: "$.response.hostname"
- name: uptime
type: string
mapping: "$.response.upTime"
- name: reachability
type: string
mapping: "$.response.reachabilityStatus"
consumes:
- type: http
namespace: cisco
baseUri: "https://bhp-dnac.cisco.com/dna/intent/api/v1"
authentication:
type: bearer
token: "$secrets.cisco_dnac_token"
resources:
- name: devices
path: "/network-device/{{device_id}}"
inputParameters:
- name: device_id
in: path
operations:
- name: get-device
method: GET
---
naftiko: "0.5"
info:
label: "Cloudflare CDN Performance"
description: "Retrieves Cloudflare analytics for BHP public-facing websites including total requests and bandwidth."
tags:
- networking
- cloudflare
- cdn
capability:
exposes:
- type: mcp
namespace: cdn-analytics
port: 8080
tools:
- name: get-cdn-analytics
description: "Retrieve Cloudflare zone analytics."
inputParameters:
- name: zone_id
in: body
type: string
description: "The Cloudflare zone ID."
- name: since
in: body
type: string
description: "Start time in ISO 8601."
call: "cloudflare.get-analytics"
with:
zone_id: "{{zone_id}}"
since: "{{since}}"
consumes:
- type: http
namespace: cloudflare
baseUri: "https://api.cloudflare.com/client/v4"
authentication:
type: bearer
token: "$secrets.cloudflare_token"
resources:
- name: analytics
path: "/zones/{{zone_id}}/analytics/dashboard?since={{since}}"
inputParameters:
- name: zone_id
in: path
- name: since
in: query
operations:
- name: get-analytics
method: GET
---
naftiko: "0.5"
info:
label: "Commodity Price Hedging Alert Pipeline"
description: "Fetches real-time commodity prices from Bloomberg, compares against hedge positions in Snowflake, generates exposure reports in Power BI, and alerts treasury via Microsoft Teams."
tags:
- commodity-trading
- hedging
- bloomberg
- snowflake
- power-bi
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: commodity-hedging
port: 8080
tools:
- name: check-hedging-exposure
description: "Check commodity prices against hedges, generate reports, and alert treasury."
inputParameters:
- name: commodity
in: body
type: string
description: "The commodity code (e.g., IRON_ORE, COPPER)."
- name: dataset_id
in: body
type: string
description: "Power BI dataset ID."
- name: group_id
in: body
type: string
description: "Power BI workspace ID."
- name: treasury_channel
in: body
type: string
description: "Microsoft Teams treasury channel."
steps:
- name: get-price
type: call
call: "bloomberg.get-market-data"
with:
security: "{{commodity}}"
- name: check-positions
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL CHECK_HEDGE_POSITIONS('{{commodity}}', {{get-price.last_price}})"
warehouse: "TREASURY_WH"
- name: refresh-report
type: call
call: "powerbi.refresh-dataset"
with:
group_id: "{{group_id}}"
dataset_id: "{{dataset_id}}"
- name: alert-treasury
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{treasury_channel}}"
text: "{{commodity}} price: ${{get-price.last_price}}. Hedge coverage: {{check-positions.coverage_pct}}%. MTM: ${{check-positions.mark_to_market}}M. Dashboard refreshed."
consumes:
- type: http
namespace: bloomberg
baseUri: "https://bsapi.bloomberg.com/eap/v1"
authentication:
type: bearer
token: "$secrets.bloomberg_token"
resources:
- name: market-data
path: "/marketdata/snapshots"
operations:
- name: get-market-data
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
---
naftiko: "0.5"
info:
label: "Concentrate Quality Control Pipeline"
description: "Retrieves concentrate assay results from Snowflake, validates against customer specs via Azure ML, updates shipping records in SAP, and notifies marketing via Microsoft Outlook."
tags:
- quality-control
- concentrate
- snowflake
- azure-machine-learning
- sap
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: concentrate-qc
port: 8080
tools:
- name: check-concentrate-quality
description: "Validate concentrate quality against specs and update shipping."
inputParameters:
- name: shipment_id
in: body
type: string
description: "The shipment identifier."
- name: marketing_email
in: body
type: string
description: "Marketing team email."
steps:
- name: get-assays
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM CONCENTRATE_ASSAYS WHERE shipment_id = '{{shipment_id}}'"
warehouse: "QUALITY_WH"
- name: validate-specs
type: call
call: "azureml.score"
with:
model_type: "concentrate_spec_check"
data: "{{get-assays.results}}"
- name: update-shipping
type: call
call: "sap.update-shipment"
with:
shipment_id: "{{shipment_id}}"
quality_status: "{{validate-specs.status}}"
- name: notify-marketing
type: call
call: "outlook.send-mail"
with:
to: "{{marketing_email}}"
subject: "Concentrate QC: Shipment {{shipment_id}}"
body: "Quality: {{validate-specs.status}}. Grade: {{validate-specs.grade_pct}}%. Penalty elements within spec: {{validate-specs.penalty_status}}."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_SHIPMENT_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: shipments
path: "/A_Shipment('{{shipment_id}}')"
inputParameters:
- name: shipment_id
in: path
operations:
- name: update-shipment
method: PATCH
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
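The `outlook.send-mail` operation posts to Microsoft Graph's `/me/sendMail`, which expects the flat `to`/`subject`/`body` arguments wrapped in a `message` envelope. A sketch of that mapping (the `send_mail_payload` helper is illustrative; the Graph payload shape itself is standard):

```python
def send_mail_payload(to: str, subject: str, body: str) -> dict:
    """Wrap flat mail arguments in the Microsoft Graph sendMail envelope."""
    return {
        "message": {
            "subject": subject,
            "body": {"contentType": "Text", "content": body},
            "toRecipients": [{"emailAddress": {"address": to}}],
        },
        "saveToSentItems": True,
    }

payload = send_mail_payload(
    "marketing@bhp.com",
    "Concentrate QC: Shipment SHP-889",
    "Quality: PASS. Grade: 26.4%.",
)
print(payload["message"]["subject"])  # Concentrate QC: Shipment SHP-889
```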
---
naftiko: "0.5"
info:
label: "Confluence Knowledge Base Lookup"
description: "Retrieves a Confluence wiki page by ID, returning mining operations documentation and standard procedures."
tags:
- documentation
- confluence
capability:
exposes:
- type: mcp
namespace: confluence
port: 8080
tools:
- name: get-page
description: "Retrieve a Confluence page by its ID."
inputParameters:
- name: page_id
in: body
type: string
description: "The Confluence page identifier."
call: "confluence.get-page-content"
with:
page_id: "{{page_id}}"
outputParameters:
- name: title
type: string
mapping: "$.title"
- name: body
type: string
mapping: "$.body.storage.value"
- name: version
type: integer
mapping: "$.version.number"
consumes:
- type: http
namespace: confluence
baseUri: "https://bhp.atlassian.net/wiki/rest/api"
authentication:
type: basic
username: "$secrets.confluence_user"
password: "$secrets.confluence_api_token"
resources:
- name: pages
path: "/content/{{page_id}}?expand=body.storage,version"
inputParameters:
- name: page_id
in: path
operations:
- name: get-page-content
method: GET
---
naftiko: "0.5"
info:
label: "Contractor Induction Tracking Pipeline"
description: "Checks contractor induction status in Workday, validates site access in ServiceNow, updates SAP contractor records, and notifies the site manager via Microsoft Outlook."
tags:
- contractor-management
- induction
- workday
- servicenow
- sap
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: contractor-induction
port: 8080
tools:
- name: track-induction
description: "Check induction status, validate access, update records, and notify."
inputParameters:
- name: contractor_id
in: body
type: string
description: "The contractor worker ID."
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: site_manager_email
in: body
type: string
description: "Site manager email."
steps:
- name: check-induction
type: call
call: "workday.get-worker"
with:
worker_id: "{{contractor_id}}"
- name: validate-access
type: call
call: "servicenow.get-records"
with:
table: "u_site_access"
query: "contractor_id={{contractor_id}}^mine_site={{mine_site}}"
- name: update-records
type: call
call: "sap.update-contractor"
with:
contractor_id: "{{contractor_id}}"
induction_status: "{{check-induction.induction_status}}"
access_status: "{{validate-access.access_status}}"
- name: notify-manager
type: call
call: "outlook.send-mail"
with:
to: "{{site_manager_email}}"
subject: "Contractor Induction: {{check-induction.full_name}} — {{mine_site}}"
body: "Induction: {{check-induction.induction_status}}. Site access: {{validate-access.access_status}}. SAP records updated."
consumes:
- type: http
namespace: workday
baseUri: "https://wd5-services1.myworkday.com/ccx/api/v1/bhp"
authentication:
type: bearer
token: "$secrets.workday_token"
resources:
- name: workers
path: "/workers/{{worker_id}}"
inputParameters:
- name: worker_id
in: path
operations:
- name: get-worker
method: GET
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: get-records
method: GET
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_CONTRACTOR_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: contractors
path: "/A_Contractor('{{contractor_id}}')"
inputParameters:
- name: contractor_id
in: path
operations:
- name: update-contractor
method: PATCH
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
---
naftiko: "0.5"
info:
label: "Contractor Safety Certification Pipeline"
description: "Verifies contractor safety certifications via Workday, checks compliance in ServiceNow, grants site access in Oracle EBS, and notifies the site manager via Microsoft Teams."
tags:
- safety
- contractor
- workday
- servicenow
- oracle-e-business-suite
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: contractor-safety
port: 8080
tools:
- name: verify-contractor-certification
description: "Given a contractor ID and mine site, verify safety certifications and grant access."
inputParameters:
- name: contractor_id
in: body
type: string
description: "The Workday contractor ID."
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: site_manager_email
in: body
type: string
description: "Email of the site manager."
steps:
- name: get-contractor
type: call
call: "workday.get-worker"
with:
worker_id: "{{contractor_id}}"
- name: check-compliance
type: call
call: "servicenow.check-compliance"
with:
worker_email: "{{get-contractor.work_email}}"
site: "{{mine_site}}"
- name: grant-access
type: call
call: "oracle-ebs.grant-site-access"
with:
worker_id: "{{contractor_id}}"
mine_site: "{{mine_site}}"
certification_status: "{{check-compliance.status}}"
- name: notify-manager
type: call
call: "msteams.send-message"
with:
recipient_upn: "{{site_manager_email}}"
text: "Contractor {{get-contractor.full_name}} certification verified for {{mine_site}}. Status: {{check-compliance.status}}. Access: {{grant-access.access_status}}."
consumes:
- type: http
namespace: workday
baseUri: "https://wd2-impl-services1.workday.com/ccx/api/v1"
authentication:
type: bearer
token: "$secrets.workday_token"
resources:
- name: workers
path: "/workers/{{worker_id}}"
inputParameters:
- name: worker_id
in: path
operations:
- name: get-worker
method: GET
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: compliance
path: "/table/compliance_check"
operations:
- name: check-compliance
method: POST
- type: http
namespace: oracle-ebs
baseUri: "https://bhp-ebs.oraclecloud.com/webservices/rest/v1"
authentication:
type: bearer
token: "$secrets.oracle_ebs_token"
resources:
- name: site-access
path: "/site-access"
operations:
- name: grant-site-access
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: messages
path: "/users/{{recipient_upn}}/sendMail"
inputParameters:
- name: recipient_upn
in: path
operations:
- name: send-message
method: POST
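The contractor pipeline above chains four `call` steps, with each step's outputs exposed to later templates under the step name (e.g. `{{get-contractor.work_email}}`). A minimal sketch of how a runtime might thread those outputs — the `run_steps` helper and the fake registry entries are illustrative, not part of Naftiko:

```python
import re

def render(value, context):
    """Resolve {{name}} and {{step.field}} placeholders against a context dict.
    Single-brace text (e.g. Datadog's "by {sensor}") is left untouched."""
    def repl(m):
        node = context
        for part in m.group(1).split("."):
            node = node[part]
        return str(node)
    return re.sub(r"\{\{([\w.-]+)\}\}", repl, value)

def run_steps(steps, registry, inputs):
    """Execute steps in order; each result is stored under the step's name
    so later steps can reference it as {{step-name.field}}."""
    context = dict(inputs)
    for step in steps:
        args = {k: render(v, context) for k, v in step["with"].items()}
        context[step["name"]] = registry[step["call"]](**args)
    return context

# Hypothetical stand-ins for the workday/servicenow consumers.
registry = {
    "workday.get-worker": lambda worker_id: {"full_name": "A. Smith", "work_email": "a@bhp.com"},
    "servicenow.check-compliance": lambda worker_email, site: {"status": "COMPLIANT"},
}
steps = [
    {"name": "get-contractor", "call": "workday.get-worker",
     "with": {"worker_id": "{{contractor_id}}"}},
    {"name": "check-compliance", "call": "servicenow.check-compliance",
     "with": {"worker_email": "{{get-contractor.work_email}}", "site": "{{mine_site}}"}},
]
ctx = run_steps(steps, registry, {"contractor_id": "C-100", "mine_site": "WAIO"})
```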
Monitors conveyor belt sensor data from Datadog, predicts splice and belt failures via Azure ML, logs findings in ServiceNow, and alerts operations via Slack.
naftiko: "0.5"
info:
label: "Conveyor Belt Health Monitoring Pipeline"
description: "Monitors conveyor belt sensor data from Datadog, predicts splice and belt failures via Azure ML, logs findings in ServiceNow, and alerts operations via Slack."
tags:
- conveyor-systems
- predictive-maintenance
- datadog
- azure-machine-learning
- servicenow
- slack
capability:
exposes:
- type: mcp
namespace: conveyor-health
port: 8080
tools:
- name: monitor-conveyor
description: "Monitor conveyor health, predict failures, log findings, and alert."
inputParameters:
- name: conveyor_id
in: body
type: string
description: "The conveyor belt identifier."
- name: slack_channel
in: body
type: string
description: "Slack channel for operations alerts."
steps:
- name: get-belt-data
type: call
call: "datadog.query-metrics"
with:
query: "avg:conveyor.tension{conveyor:{{conveyor_id}}} by {sensor}"
from: "-8h"
- name: predict-failure
type: call
call: "azureml.score"
with:
model_type: "conveyor_failure"
data: "{{get-belt-data.series}}"
- name: log-finding
type: call
call: "servicenow.create-incident"
with:
short_description: "Conveyor {{conveyor_id}}: {{predict-failure.failure_type}} predicted"
description: "Failure probability: {{predict-failure.failure_probability}}%. Remaining life: {{predict-failure.remaining_hours}}h. Component: {{predict-failure.component}}."
urgency: "2"
- name: alert-ops
type: call
call: "slack.post-message"
with:
channel: "{{slack_channel}}"
text: "CONVEYOR ALERT: {{conveyor_id}} — {{predict-failure.failure_type}} predicted. Probability: {{predict-failure.failure_probability}}%. Incident: {{log-finding.number}}"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: slack
baseUri: "https://slack.com/api"
authentication:
type: bearer
token: "$secrets.slack_bot_token"
resources:
- name: messages
path: "/chat.postMessage"
operations:
- name: post-message
method: POST
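The conveyor pipeline passes `from: "-8h"` to Datadog's v1 `/query` endpoint, which actually expects epoch seconds for `from`/`to`. A sketch of the conversion a runtime would presumably perform (the helper name and offset grammar are assumptions):

```python
import re
import time

def datadog_query_params(query, from_offset, now=None):
    """Build Datadog v1 /query parameters, converting a relative offset
    like "-8h" into the epoch-second range the API expects."""
    now = int(now if now is not None else time.time())
    m = re.fullmatch(r"-(\d+)([smhd])", from_offset)
    seconds = int(m.group(1)) * {"s": 1, "m": 60, "h": 3600, "d": 86400}[m.group(2)]
    return {"query": query, "from": now - seconds, "to": now}

params = datadog_query_params(
    "avg:conveyor.tension{conveyor:CV-07} by {sensor}", "-8h", now=1_700_000_000
)
```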
Monitors crusher vibration and throughput from Datadog, predicts liner wear via Azure ML, schedules replacement in SAP PM, and notifies the maintenance team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Crusher Performance Optimization Pipeline"
description: "Monitors crusher vibration and throughput from Datadog, predicts liner wear via Azure ML, schedules replacement in SAP PM, and notifies the maintenance team via Microsoft Teams."
tags:
- crushing
- predictive-maintenance
- datadog
- azure-machine-learning
- sap
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: crusher-optimization
port: 8080
tools:
- name: optimize-crusher
description: "Monitor crusher, predict wear, schedule replacement, and notify."
inputParameters:
- name: crusher_id
in: body
type: string
description: "The crusher equipment ID."
- name: maint_channel
in: body
type: string
description: "Microsoft Teams maintenance channel."
steps:
- name: get-performance
type: call
call: "datadog.query-metrics"
with:
query: "avg:crusher.vibration{crusher:{{crusher_id}}} by {axis}"
from: "-12h"
- name: predict-wear
type: call
call: "azureml.score"
with:
model_type: "crusher_liner_wear"
data: "{{get-performance.series}}"
- name: schedule-replacement
type: call
call: "sap.create-work-order"
with:
order_type: "PREVENTIVE"
equipment_id: "{{crusher_id}}"
description: "Liner replacement — remaining life: {{predict-wear.remaining_life_hours}}h"
- name: notify-maintenance
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{maint_channel}}"
text: "Crusher {{crusher_id}}: Liner remaining life {{predict-wear.remaining_life_hours}}h. Wear rate: {{predict-wear.wear_rate_mm_h}}mm/h. WO: {{schedule-replacement.order_id}}"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MAINTORDER"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: work-orders
path: "/MaintenanceOrder"
operations:
- name: create-work-order
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Scans Docker container images for vulnerabilities using Snyk for mining operations software deployments.
naftiko: "0.5"
info:
label: "Docker Image Vulnerability Scan"
description: "Scans Docker container images for vulnerabilities using Snyk for mining operations software deployments."
tags:
- security
- snyk
capability:
exposes:
- type: mcp
namespace: container-security
port: 8080
tools:
- name: scan-image
description: "Scan a Docker image for vulnerabilities."
inputParameters:
- name: image
in: body
type: string
description: "The Docker image reference."
call: "snyk.test-image"
with:
image: "{{image}}"
outputParameters:
- name: vulnerability_count
type: integer
mapping: "$.summary.totalVulnerabilities"
- name: critical_count
type: integer
mapping: "$.summary.criticalVulnerabilities"
consumes:
- type: http
namespace: snyk
baseUri: "https://api.snyk.io/v1"
authentication:
type: bearer
token: "$secrets.snyk_token"
resources:
- name: test
path: "/test/docker"
operations:
- name: test-image
method: POST
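The `outputParameters` mappings above (`$.summary.totalVulnerabilities`) use a dotted subset of JSONPath. A minimal resolver for that subset — sufficient for every mapping in this catalog, though not a full JSONPath implementation:

```python
def resolve_mapping(payload, mapping):
    """Walk a dotted $.a.b path against a decoded JSON payload."""
    node = payload
    for part in mapping.lstrip("$.").split("."):
        node = node[part]
    return node

# Hypothetical Snyk response shape matching the two mappings above.
payload = {"summary": {"totalVulnerabilities": 12, "criticalVulnerabilities": 2}}
total = resolve_mapping(payload, "$.summary.totalVulnerabilities")
critical = resolve_mapping(payload, "$.summary.criticalVulnerabilities")
```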
Retrieves assay results from Snowflake, validates against lab QC standards via Azure ML, logs discrepancies in Jira, and notifies the geology team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Drill Core Assay Validation Pipeline"
description: "Retrieves assay results from Snowflake, validates against lab QC standards via Azure ML, logs discrepancies in Jira, and notifies the geology team via Microsoft Teams."
tags:
- geology
- assay
- snowflake
- azure-machine-learning
- jira
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: assay-validation
port: 8080
tools:
- name: validate-assays
description: "Validate drill core assays and flag discrepancies."
inputParameters:
- name: drill_hole_id
in: body
type: string
description: "The drill hole identifier."
- name: project_key
in: body
type: string
description: "Jira project key."
- name: geology_channel
in: body
type: string
description: "Microsoft Teams geology channel."
steps:
- name: get-assays
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM ASSAY_RESULTS WHERE drill_hole_id = '{{drill_hole_id}}' ORDER BY depth_from"
warehouse: "GEOLOGY_WH"
- name: validate-qc
type: call
call: "azureml.score"
with:
model_type: "assay_qc_validation"
data: "{{get-assays.results}}"
- name: log-discrepancies
type: call
call: "jira.create-issue"
with:
project_key: "{{project_key}}"
summary: "Assay QC: {{validate-qc.discrepancy_count}} discrepancies in {{drill_hole_id}}"
description: "Samples: {{validate-qc.total_samples}}. Discrepancies: {{validate-qc.discrepancy_count}}. Max variance: {{validate-qc.max_variance_pct}}%."
issue_type: "Bug"
- name: notify-geology
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{geology_channel}}"
text: "Assay validation for {{drill_hole_id}}: {{validate-qc.discrepancy_count}} QC discrepancies out of {{validate-qc.total_samples}} samples. Jira: {{log-discrepancies.key}}"
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: jira
baseUri: "https://bhp.atlassian.net/rest/api/3"
authentication:
type: basic
username: "$secrets.jira_user"
password: "$secrets.jira_api_token"
resources:
- name: issues
path: "/issue"
operations:
- name: create-issue
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
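The assay query interpolates `{{drill_hole_id}}` straight into SQL text. The Snowflake SQL API's `/statements` endpoint also accepts positional bindings, which sidesteps quoting issues with identifiers like drill hole names; a sketch of building such a request body (the helper is illustrative):

```python
def snowflake_statement(sql, binds, warehouse):
    """Build a Snowflake SQL API v2 /statements body using positional
    ?-bindings instead of string interpolation."""
    return {
        "statement": sql,
        "bindings": {str(i + 1): {"type": "TEXT", "value": v} for i, v in enumerate(binds)},
        "warehouse": warehouse,
    }

body = snowflake_statement(
    "SELECT * FROM ASSAY_RESULTS WHERE drill_hole_id = ? ORDER BY depth_from",
    ["DH-042"],
    "GEOLOGY_WH",
)
```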
Collects air quality sensor data from Datadog, predicts dust levels via Azure ML, triggers suppression systems via SAP, and reports compliance to regulators via SharePoint.
naftiko: "0.5"
info:
label: "Dust Suppression Monitoring Pipeline"
description: "Collects air quality sensor data from Datadog, predicts dust levels via Azure ML, triggers suppression systems via SAP, and reports compliance to regulators via SharePoint."
tags:
- dust-management
- air-quality
- datadog
- azure-machine-learning
- sap
- sharepoint
capability:
exposes:
- type: mcp
namespace: dust-suppression
port: 8080
tools:
- name: monitor-dust
description: "Monitor dust levels, predict exposure, trigger suppression, and report."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
steps:
- name: get-air-quality
type: call
call: "datadog.query-metrics"
with:
query: "avg:mine.dust.pm10{site:{{mine_site}}} by {monitor}"
from: "-4h"
- name: predict-dust
type: call
call: "azureml.score"
with:
model_type: "dust_prediction"
data: "{{get-air-quality.series}}"
- name: trigger-suppression
type: call
call: "sap.create-work-order"
with:
order_type: "DUST_SUPPRESSION"
mine_site: "{{mine_site}}"
description: "Dust suppression triggered. PM10: {{predict-dust.current_pm10}} ug/m3. Forecast: {{predict-dust.forecast_pm10}} ug/m3."
- name: upload-compliance
type: call
call: "sharepoint.upload-file"
with:
site_id: "env_compliance_site"
folder_path: "DustReports/{{mine_site}}"
file_name: "dust_report_{{mine_site}}.pdf"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MAINTORDER"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: work-orders
path: "/MaintenanceOrder"
operations:
- name: create-work-order
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
Retrieves an active Dynatrace problem by ID for BHP mine IT infrastructure monitoring.
naftiko: "0.5"
info:
label: "Dynatrace Infrastructure Monitor"
description: "Retrieves an active Dynatrace problem by ID for BHP mine IT infrastructure monitoring."
tags:
- monitoring
- dynatrace
- infrastructure
capability:
exposes:
- type: mcp
namespace: infra-monitoring
port: 8080
tools:
- name: get-dynatrace-problem
description: "Look up a Dynatrace problem by ID."
inputParameters:
- name: problem_id
in: body
type: string
description: "The Dynatrace problem ID."
call: "dynatrace.get-problem"
with:
problem_id: "{{problem_id}}"
outputParameters:
- name: title
type: string
mapping: "$.title"
- name: severity
type: string
mapping: "$.severityLevel"
- name: impact
type: string
mapping: "$.impactLevel"
consumes:
- type: http
namespace: dynatrace
baseUri: "https://bhp.live.dynatrace.com/api/v2"
authentication:
type: bearer
token: "$secrets.dynatrace_token"
resources:
- name: problems
path: "/problems/{{problem_id}}"
inputParameters:
- name: problem_id
in: path
operations:
- name: get-problem
method: GET
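The Dynatrace consumer above resolves to a single authenticated GET. A sketch of composing that request (URL plus header only; sending is left to the HTTP client, and the bearer scheme follows the `authentication` block above):

```python
def dynatrace_problem_request(base_uri, problem_id, token):
    """Compose the GET request for /problems/{id}: returns (url, headers)."""
    return (
        f"{base_uri}/problems/{problem_id}",
        {"Authorization": f"Bearer {token}"},
    )

url, headers = dynatrace_problem_request(
    "https://bhp.live.dynatrace.com/api/v2", "P-2024-01", "token-value"
)
```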
Queries Elasticsearch for mine operations logs, returning matching records for incident investigation and process analysis.
naftiko: "0.5"
info:
label: "Elastic Search Operations Log Query"
description: "Queries Elasticsearch for mine operations logs, returning matching records for incident investigation and process analysis."
tags:
- search
- elasticsearch
capability:
exposes:
- type: mcp
namespace: elasticsearch
port: 8080
tools:
- name: search-ops-logs
description: "Search mine operations logs in Elasticsearch."
inputParameters:
- name: index
in: body
type: string
description: "The Elasticsearch index name."
- name: query
in: body
type: string
description: "The search query string."
call: "elasticsearch.search"
with:
index: "{{index}}"
q: "{{query}}"
outputParameters:
- name: total_hits
type: integer
mapping: "$.hits.total.value"
- name: results
type: array
mapping: "$.hits.hits"
consumes:
- type: http
namespace: elasticsearch
baseUri: "https://elasticsearch.bhp.com:9200"
authentication:
type: basic
username: "$secrets.elasticsearch_user"
password: "$secrets.elasticsearch_password"
resources:
- name: search
path: "/{{index}}/_search"
inputParameters:
- name: index
in: path
operations:
- name: search
method: GET
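The Elasticsearch tool uses the URI-search form (`GET /{index}/_search?q=...`), so both the index segment and the query string need URL encoding. A small builder sketch:

```python
from urllib.parse import quote, urlencode

def es_search_url(base_uri, index, query):
    """Build a URI-search URL; the index is path-escaped and the
    query string is form-encoded into the q parameter."""
    return f"{base_uri}/{quote(index, safe='')}/_search?{urlencode({'q': query})}"

url = es_search_url(
    "https://elasticsearch.bhp.com:9200",
    "ops-logs-2024",
    "equipment_id:EX-2001 AND level:ERROR",
)
```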
Collects energy consumption from Datadog, benchmarks efficiency via Snowflake, updates SAP cost centers, and reports to operations via Power BI.
naftiko: "0.5"
info:
label: "Energy Efficiency Monitoring Pipeline"
description: "Collects energy consumption from Datadog, benchmarks efficiency via Snowflake, updates SAP cost centers, and reports to operations via Power BI."
tags:
- energy
- efficiency
- datadog
- snowflake
- sap
- power-bi
capability:
exposes:
- type: mcp
namespace: energy-efficiency
port: 8080
tools:
- name: monitor-energy-efficiency
description: "Monitor energy efficiency, benchmark, update costs, and report."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: dataset_id
in: body
type: string
description: "Power BI dataset ID."
- name: group_id
in: body
type: string
description: "Power BI workspace ID."
steps:
- name: get-consumption
type: call
call: "datadog.query-metrics"
with:
query: "sum:mine.energy.kwh{site:{{mine_site}}} by {area}"
from: "-24h"
- name: benchmark
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL BENCHMARK_ENERGY_EFFICIENCY('{{mine_site}}')"
warehouse: "ENERGY_WH"
- name: update-costs
type: call
call: "sap.update-cost-center"
with:
mine_site: "{{mine_site}}"
energy_cost: "{{benchmark.total_energy_cost}}"
- name: refresh-dashboard
type: call
call: "powerbi.refresh-dataset"
with:
group_id: "{{group_id}}"
dataset_id: "{{dataset_id}}"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_COSTCENTER_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: cost-centers
path: "/A_CostCenter"
operations:
- name: update-cost-center
method: PATCH
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
Collects environmental sensor data from Azure IoT, validates against compliance thresholds in SAP BW, logs exceptions in ServiceNow, and alerts the environment team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Environmental Compliance Monitoring Pipeline"
description: "Collects environmental sensor data from Azure IoT, validates against compliance thresholds in SAP BW, logs exceptions in ServiceNow, and alerts the environment team via Microsoft Teams."
tags:
- environment
- compliance
- azure
- sap-bw
- servicenow
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: env-compliance
port: 8080
tools:
- name: check-environmental-compliance
description: "Given a mine site and sensor type, check environmental compliance and alert on exceptions."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site location code."
- name: sensor_type
in: body
type: string
description: "Type of environmental sensor (e.g., air_quality, water_ph, dust_level)."
- name: threshold_id
in: body
type: string
description: "The SAP BW threshold configuration ID."
- name: env_team_channel
in: body
type: string
description: "Microsoft Teams channel for environment team."
steps:
- name: get-sensor-data
type: call
call: "azure-iot.get-telemetry"
with:
device_group: "{{mine_site}}_{{sensor_type}}"
- name: check-threshold
type: call
call: "sap-bw.check-threshold"
with:
threshold_id: "{{threshold_id}}"
value: "{{get-sensor-data.latest_value}}"
- name: log-exception
type: call
call: "servicenow.create-incident"
with:
short_description: "Environmental compliance check: {{mine_site}} {{sensor_type}}"
category: "environmental"
description: "Sensor reading: {{get-sensor-data.latest_value}}. Threshold status: {{check-threshold.status}}."
- name: alert-team
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{env_team_channel}}"
text: "Environmental check at {{mine_site}}: {{sensor_type}} reading {{get-sensor-data.latest_value}}. Status: {{check-threshold.status}}. ServiceNow: {{log-exception.number}}."
consumes:
- type: http
namespace: azure-iot
baseUri: "https://bhp-iot.azure-devices.net"
authentication:
type: bearer
token: "$secrets.azure_iot_token"
resources:
- name: telemetry
path: "/devices/{{device_group}}/telemetry/latest"
inputParameters:
- name: device_group
in: path
operations:
- name: get-telemetry
method: GET
- type: http
namespace: sap-bw
baseUri: "https://bhp-bw.sap.com/sap/opu/odata/sap/API_THRESHOLD_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: thresholds
path: "/A_Threshold('{{threshold_id}}')/check"
inputParameters:
- name: threshold_id
in: path
operations:
- name: check-threshold
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
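The ServiceNow consumers in this catalog all POST to `/api/now/table/incident` with HTTP Basic auth. A sketch of assembling that request (the helper and the instance name are illustrative):

```python
import base64

def servicenow_incident_request(instance, user, password, short_description, **fields):
    """Return (url, headers, body) for creating an incident via the Table API."""
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    headers = {"Authorization": f"Basic {creds}", "Content-Type": "application/json"}
    body = {"short_description": short_description, **fields}
    return f"https://{instance}.service-now.com/api/now/table/incident", headers, body

url, headers, body = servicenow_incident_request(
    "bhp", "svc_user", "svc_pass",
    "Environmental compliance check: WAIO air_quality",
    category="environmental",
)
```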
Queries SAP PM equipment downtime events staged in Snowflake, analyzes failure patterns, refreshes the Power BI downtime report, and notifies maintenance leadership via Microsoft Teams.
naftiko: "0.5"
info:
label: "Equipment Downtime Analysis Pipeline"
description: "Queries SAP PM equipment downtime events staged in Snowflake, analyzes failure patterns, refreshes the Power BI downtime report, and notifies maintenance leadership via Microsoft Teams."
tags:
- maintenance
- analytics
- sap
- snowflake
- power-bi
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: downtime-analysis
port: 8080
tools:
- name: analyze-equipment-downtime
description: "Given an equipment ID and date range, analyze downtime patterns and distribute insights."
inputParameters:
- name: equipment_id
in: body
type: string
description: "The SAP equipment ID."
- name: start_date
in: body
type: string
description: "Analysis start date (YYYY-MM-DD)."
- name: end_date
in: body
type: string
description: "Analysis end date (YYYY-MM-DD)."
- name: bi_dataset_id
in: body
type: string
description: "Power BI dataset ID."
- name: bi_group_id
in: body
type: string
description: "Power BI workspace ID."
- name: maintenance_channel
in: body
type: string
description: "Microsoft Teams channel for maintenance leadership."
steps:
- name: get-downtime-events
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT event_type, duration_hours, root_cause FROM EQUIPMENT_DOWNTIME WHERE equipment_id = '{{equipment_id}}' AND event_date BETWEEN '{{start_date}}' AND '{{end_date}}' ORDER BY duration_hours DESC"
warehouse: "MAINTENANCE_WH"
- name: refresh-dashboard
type: call
call: "powerbi.refresh-dataset"
with:
group_id: "{{bi_group_id}}"
dataset_id: "{{bi_dataset_id}}"
- name: notify-leadership
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{maintenance_channel}}"
text: "Downtime analysis for equipment {{equipment_id}} ({{start_date}} to {{end_date}}): {{get-downtime-events.rowCount}} downtime events analyzed. Dashboard refreshed."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Retrieves geophysical survey data from S3, ranks exploration targets via Azure ML, creates drill programs in Jira, and distributes findings via SharePoint.
naftiko: "0.5"
info:
label: "Exploration Target Prioritization Pipeline"
description: "Retrieves geophysical survey data from S3, ranks exploration targets via Azure ML, creates drill programs in Jira, and distributes findings via SharePoint."
tags:
- exploration
- geophysics
- amazon-s3
- azure-machine-learning
- jira
- sharepoint
capability:
exposes:
- type: mcp
namespace: exploration-targeting
port: 8080
tools:
- name: prioritize-targets
description: "Rank exploration targets, create drill programs, and distribute findings."
inputParameters:
- name: survey_bucket
in: body
type: string
description: "S3 bucket with geophysical data."
- name: survey_key
in: body
type: string
description: "S3 key for survey data."
- name: project_key
in: body
type: string
description: "Jira project key."
steps:
- name: get-survey
type: call
call: "s3.get-object"
with:
bucket: "{{survey_bucket}}"
key: "{{survey_key}}"
- name: rank-targets
type: call
call: "azureml.score"
with:
model_type: "exploration_targeting"
data: "{{get-survey.body}}"
- name: create-drill-program
type: call
call: "jira.create-issue"
with:
project_key: "{{project_key}}"
summary: "Drill Program: {{rank-targets.top_target}} — Score {{rank-targets.top_score}}"
description: "{{rank-targets.target_count}} targets ranked. Top: {{rank-targets.top_target}} ({{rank-targets.top_score}}). Estimated resource: {{rank-targets.estimated_tonnes}}Mt."
issue_type: "Epic"
- name: publish-findings
type: call
call: "sharepoint.upload-file"
with:
site_id: "exploration_site"
folder_path: "Targeting/{{rank-targets.top_target}}"
file_name: "target_ranking.pdf"
consumes:
- type: http
namespace: s3
baseUri: "https://{{bucket}}.s3.amazonaws.com"
authentication:
type: aws-sigv4
accessKeyId: "$secrets.aws_access_key"
secretAccessKey: "$secrets.aws_secret_key"
resources:
- name: objects
path: "/{{key}}"
inputParameters:
- name: bucket
in: path
- name: key
in: path
operations:
- name: get-object
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: jira
baseUri: "https://bhp.atlassian.net/rest/api/3"
authentication:
type: basic
username: "$secrets.jira_user"
password: "$secrets.jira_api_token"
resources:
- name: issues
path: "/issue"
operations:
- name: create-issue
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
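The S3 consumer above uses virtual-hosted-style addressing, where the bucket becomes the hostname and the key the path. A sketch of building that URL with per-segment encoding so keys containing slashes keep their structure (SigV4 signing itself is omitted):

```python
from urllib.parse import quote

def s3_object_url(bucket, key):
    """Virtual-hosted-style S3 URL; each key segment is percent-encoded
    so slashes in keys like surveys/2024/grav.tif are preserved."""
    encoded = "/".join(quote(seg, safe="") for seg in key.split("/"))
    return f"https://{bucket}.s3.amazonaws.com/{encoded}"

url = s3_object_url("bhp-geophysics", "surveys/2024/gravity survey.tif")
```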
Collects operator fatigue detection data from Datadog, analyzes risk patterns via Azure ML, creates safety reports in ServiceNow, and alerts the HSEC team via PagerDuty.
naftiko: "0.5"
info:
label: "Fatigue Management Monitoring Pipeline"
description: "Collects operator fatigue detection data from Datadog, analyzes risk patterns via Azure ML, creates safety reports in ServiceNow, and alerts the HSEC team via PagerDuty."
tags:
- safety
- fatigue-management
- datadog
- azure-machine-learning
- servicenow
- pagerduty
capability:
exposes:
- type: mcp
namespace: fatigue-management
port: 8080
tools:
- name: monitor-fatigue
description: "Monitor operator fatigue, analyze risk, report safety issues, and alert HSEC."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: pagerduty_service
in: body
type: string
description: "PagerDuty service for HSEC."
steps:
- name: get-fatigue-data
type: call
call: "datadog.query-metrics"
with:
query: "sum:fatigue.events{site:{{mine_site}}} by {operator}"
from: "-8h"
- name: analyze-risk
type: call
call: "azureml.score"
with:
model_type: "fatigue_risk"
data: "{{get-fatigue-data.series}}"
- name: create-report
type: call
call: "servicenow.create-incident"
with:
short_description: "Fatigue risk: {{mine_site}} — {{analyze-risk.high_risk_count}} operators"
description: "High risk operators: {{analyze-risk.high_risk_count}}. Events this shift: {{analyze-risk.total_events}}. Highest risk score: {{analyze-risk.max_risk_score}}."
urgency: "1"
category: "safety"
- name: alert-hsec
type: call
call: "pagerduty.create-incident"
with:
service_id: "{{pagerduty_service}}"
title: "Fatigue alert: {{mine_site}} — {{analyze-risk.high_risk_count}} high-risk operators"
urgency: "high"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: pagerduty
baseUri: "https://api.pagerduty.com"
authentication:
type: bearer
token: "$secrets.pagerduty_token"
resources:
- name: incidents
path: "/incidents"
operations:
- name: create-incident
method: POST
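The `pagerduty.create-incident` call maps the `service_id`, `title`, and `urgency` inputs onto PagerDuty's REST `POST /incidents` body. A sketch of that body under the v2 REST schema (the helper name is illustrative):

```python
def pagerduty_incident(service_id, title, urgency="high"):
    """Request body for PagerDuty REST POST /incidents."""
    return {
        "incident": {
            "type": "incident",
            "title": title,
            "urgency": urgency,
            "service": {"id": service_id, "type": "service_reference"},
        }
    }

body = pagerduty_incident("PHSEC01", "Fatigue alert: WAIO — 3 high-risk operators")
```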
Logs sample submissions in ServiceNow, tracks lab turnaround in Snowflake, validates results via Azure ML, and distributes findings via Microsoft Outlook.
naftiko: "0.5"
info:
label: "Geochemical Sampling Workflow"
description: "Logs sample submissions in ServiceNow, tracks lab turnaround in Snowflake, validates results via Azure ML, and distributes findings via Microsoft Outlook."
tags:
- geochemistry
- sampling
- servicenow
- snowflake
- azure-machine-learning
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: geochemical-sampling
port: 8080
tools:
- name: manage-sampling
description: "Log samples, track turnaround, validate results, and distribute."
inputParameters:
- name: batch_id
in: body
type: string
description: "The sample batch identifier."
- name: geologist_email
in: body
type: string
description: "Geologist email for results."
steps:
- name: get-submissions
type: call
call: "servicenow.get-records"
with:
table: "u_sample_submission"
query: "batch_id={{batch_id}}"
- name: track-turnaround
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM LAB_RESULTS WHERE batch_id = '{{batch_id}}'"
warehouse: "GEOLOGY_WH"
- name: validate-results
type: call
call: "azureml.score"
with:
model_type: "assay_validation"
data: "{{track-turnaround.results}}"
- name: distribute-results
type: call
call: "outlook.send-mail"
with:
to: "{{geologist_email}}"
subject: "Lab Results: Batch {{batch_id}}"
body: "Samples: {{get-submissions.count}}. Results received: {{track-turnaround.rowCount}}. QC status: {{validate-results.qc_status}}. Avg grade: {{validate-results.avg_grade}}%."
consumes:
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: get-records
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
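The `snowflake.execute-statement` calls in this and later capabilities POST to the Snowflake SQL API v2 `/statements` resource. A minimal sketch of the request body (the `statement` and `warehouse` fields match Snowflake's documented SQL API; the batch ID is illustrative, and optional fields such as `database` and `timeout` are omitted):

```python
def build_snowflake_statement(statement: str, warehouse: str) -> dict:
    # Snowflake SQL API v2: POST /api/v2/statements with a JSON body
    # carrying the SQL text and the warehouse to run it on.
    return {"statement": statement, "warehouse": warehouse}

body = build_snowflake_statement(
    "SELECT * FROM LAB_RESULTS WHERE batch_id = 'B-1001'",
    "GEOLOGY_WH",
)
```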
Retrieves the latest GitHub Actions workflow run status for BHP open-source mining tools.
naftiko: "0.5"
info:
label: "GitHub Actions Workflow Status"
description: "Retrieves the latest GitHub Actions workflow run status for BHP open-source mining tools."
tags:
- ci
- github-actions
- development
capability:
exposes:
- type: mcp
namespace: github-ci
port: 8080
tools:
- name: get-workflow-run
description: "Look up the latest GitHub Actions workflow run."
inputParameters:
- name: repo
in: body
type: string
description: "The repository (owner/name)."
- name: workflow_id
in: body
type: string
description: "The workflow ID or filename."
call: "github.get-workflow-runs"
with:
repo: "{{repo}}"
workflow_id: "{{workflow_id}}"
consumes:
- type: http
namespace: github
baseUri: "https://api.github.com"
authentication:
type: bearer
token: "$secrets.github_token"
resources:
- name: workflow-runs
path: "/repos/{{repo}}/actions/workflows/{{workflow_id}}/runs?per_page=1"
inputParameters:
- name: repo
in: path
- name: workflow_id
in: path
operations:
- name: get-workflow-runs
method: GET
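Resolving the `get-workflow-runs` resource path against its inputs yields a standard GitHub REST call. A sketch of that substitution (the endpoint shape follows GitHub's Actions API; the repository name is a placeholder):

```python
def build_workflow_runs_url(repo: str, workflow_id: str) -> str:
    # Mirrors the resource path: latest run only, via per_page=1.
    return (f"https://api.github.com/repos/{repo}"
            f"/actions/workflows/{workflow_id}/runs?per_page=1")

url = build_workflow_runs_url("bhp/mining-tools", "ci.yml")
```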
Retrieves the latest GitLab CI/CD pipeline status for mine automation software projects.
naftiko: "0.5"
info:
label: "GitLab Pipeline Status"
description: "Retrieves the latest GitLab CI/CD pipeline status for mine automation software projects."
tags:
- ci
- gitlab
- pipeline
capability:
exposes:
- type: mcp
namespace: ci-pipelines
port: 8080
tools:
- name: get-pipeline-status
description: "Look up the latest GitLab pipeline for a project."
inputParameters:
- name: project_id
in: body
type: string
description: "The GitLab project ID."
call: "gitlab.get-latest-pipeline"
with:
project_id: "{{project_id}}"
outputParameters:
- name: pipeline_id
type: integer
mapping: "$[0].id"
- name: status
type: string
mapping: "$[0].status"
- name: ref
type: string
mapping: "$[0].ref"
consumes:
- type: http
namespace: gitlab
baseUri: "https://gitlab.bhp.com/api/v4"
authentication:
type: bearer
token: "$secrets.gitlab_token"
resources:
- name: pipelines
path: "/projects/{{project_id}}/pipelines?per_page=1"
inputParameters:
- name: project_id
in: path
operations:
- name: get-latest-pipeline
method: GET
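The `outputParameters` above apply JSONPath-style mappings to the pipelines list response. A sketch of how `$[0].id`, `$[0].status`, and `$[0].ref` resolve, using a hand-built sample response (the values are illustrative, not real pipeline data):

```python
# Sample of GitLab's GET /projects/:id/pipelines response (a JSON array).
sample_response = [
    {"id": 4821, "status": "success", "ref": "main"},
]

# "$[0].id" -> first array element, then the "id" key; same for the rest.
pipeline_id = sample_response[0]["id"]
status = sample_response[0]["status"]
ref = sample_response[0]["ref"]
```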
Retrieves geolocation coordinates and place details for a BHP mine site address using Google Maps Geocoding API.
naftiko: "0.5"
info:
label: "Google Maps Mine Site Geolocation"
description: "Retrieves geolocation coordinates and place details for a BHP mine site address using Google Maps Geocoding API."
tags:
- geospatial
- google-maps
- mining
capability:
exposes:
- type: mcp
namespace: geolocation
port: 8080
tools:
- name: geocode-mine-site
description: "Geocode a mine site address using Google Maps."
inputParameters:
- name: address
in: body
type: string
description: "The mine site address to geocode."
call: "googlemaps.geocode"
with:
address: "{{address}}"
outputParameters:
- name: latitude
type: number
mapping: "$.results[0].geometry.location.lat"
- name: longitude
type: number
mapping: "$.results[0].geometry.location.lng"
- name: formatted_address
type: string
mapping: "$.results[0].formatted_address"
consumes:
- type: http
namespace: googlemaps
baseUri: "https://maps.googleapis.com/maps/api"
authentication:
type: apiKey
name: "key"
in: query
value: "$secrets.google_maps_api_key"
resources:
- name: geocode
path: "/geocode/json?address={{address}}"
inputParameters:
- name: address
in: query
operations:
- name: geocode
method: GET
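Mine site addresses contain spaces and commas, so the `{{address}}` substitution needs URL encoding before it reaches the Geocoding endpoint. A standard-library sketch (the address and key are placeholders):

```python
from urllib.parse import urlencode

def build_geocode_url(address: str, api_key: str) -> str:
    # urlencode percent-encodes commas and turns spaces into "+",
    # which the raw path template does not do by itself.
    query = urlencode({"address": address, "key": api_key})
    return f"https://maps.googleapis.com/maps/api/geocode/json?{query}"

url = build_geocode_url("Olympic Dam, SA 5725, Australia", "KEY")
```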
Creates a shareable Grafana dashboard snapshot for mine site operational metrics.
naftiko: "0.5"
info:
label: "Grafana Dashboard Snapshot"
description: "Creates a shareable Grafana dashboard snapshot for mine site operational metrics."
tags:
- monitoring
- grafana
capability:
exposes:
- type: mcp
namespace: grafana
port: 8080
tools:
- name: create-snapshot
description: "Create a shareable snapshot of a Grafana dashboard."
inputParameters:
- name: dashboard_uid
in: body
type: string
description: "The Grafana dashboard UID."
- name: expires_in
in: body
type: integer
description: "Snapshot expiry in seconds."
call: "grafana.create-snapshot"
with:
dashboard_uid: "{{dashboard_uid}}"
expires: "{{expires_in}}"
outputParameters:
- name: snapshot_url
type: string
mapping: "$.url"
- name: snapshot_id
type: string
mapping: "$.id"
consumes:
- type: http
namespace: grafana
baseUri: "https://grafana.bhp.com/api"
authentication:
type: bearer
token: "$secrets.grafana_token"
resources:
- name: snapshots
path: "/snapshots"
operations:
- name: create-snapshot
method: POST
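Grafana's documented snapshot endpoint expects the full dashboard JSON plus an `expires` lifetime in seconds, so a caller typically fetches the dashboard by UID first. A hedged sketch of the snapshot request body (field names follow Grafana's HTTP API; the dashboard model here is a stub):

```python
def build_snapshot_body(dashboard_model: dict, expires_in: int) -> dict:
    # Grafana POST /api/snapshots: "dashboard" is the full dashboard model,
    # "expires" is the snapshot lifetime in seconds (0 = never expire).
    return {"dashboard": dashboard_model, "expires": expires_in}

body = build_snapshot_body({"uid": "ops-metrics", "panels": []}, 3600)
```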
Aggregates emissions data from Snowflake, calculates carbon intensity via Azure ML, updates regulatory reports in SharePoint, and notifies sustainability officers via Microsoft Outlook.
naftiko: "0.5"
info:
label: "Greenhouse Gas Emissions Reporting Pipeline"
description: "Aggregates emissions data from Snowflake, calculates carbon intensity via Azure ML, updates regulatory reports in SharePoint, and notifies sustainability officers via Microsoft Outlook."
tags:
- emissions
- sustainability
- snowflake
- azure-machine-learning
- sharepoint
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: ghg-reporting
port: 8080
tools:
- name: report-emissions
description: "Aggregate emissions, calculate intensity, update reports, and notify."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: sustainability_email
in: body
type: string
description: "Sustainability team email."
steps:
- name: get-emissions
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM GHG_EMISSIONS WHERE mine_site = '{{mine_site}}' AND period >= DATEADD(quarter, -1, CURRENT_DATE())"
warehouse: "SUSTAINABILITY_WH"
- name: calculate-intensity
type: call
call: "azureml.score"
with:
model_type: "carbon_intensity"
data: "{{get-emissions.results}}"
- name: upload-report
type: call
call: "sharepoint.upload-file"
with:
site_id: "sustainability_site"
folder_path: "EmissionsReports/{{mine_site}}"
file_name: "ghg_report_{{mine_site}}.pdf"
- name: notify-sustainability
type: call
call: "outlook.send-mail"
with:
to: "{{sustainability_email}}"
subject: "GHG Emissions Report: {{mine_site}}"
body: "Total CO2e: {{calculate-intensity.total_co2e_tonnes}} tonnes. Intensity: {{calculate-intensity.intensity_per_tonne}} kg CO2e/t ore."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
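The SharePoint `upload-file` resource path uses Microsoft Graph's drive-item addressing (`root:/path:/content`). A sketch of resolving the template for an emissions report (the site ID comes from the steps above; the mine site code is a placeholder):

```python
def build_upload_path(site_id: str, folder_path: str, file_name: str) -> str:
    # Graph simple upload: PUT /sites/{site}/drive/root:/{folder}/{file}:/content
    return f"/sites/{site_id}/drive/root:/{folder_path}/{file_name}:/content"

path = build_upload_path("sustainability_site", "EmissionsReports/WAIO",
                         "ghg_report_WAIO.pdf")
```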
Reads a secret from HashiCorp Vault for secure credential retrieval in mining automation pipelines.
naftiko: "0.5"
info:
label: "HashiCorp Vault Secret Read"
description: "Reads a secret from HashiCorp Vault for secure credential retrieval in mining automation pipelines."
tags:
- secrets-management
- hashicorp-vault
capability:
exposes:
- type: mcp
namespace: vault
port: 8080
tools:
- name: read-secret
description: "Read a secret from HashiCorp Vault."
inputParameters:
- name: secret_path
in: body
type: string
description: "The secret path in Vault."
call: "vault.read-secret"
with:
path: "{{secret_path}}"
outputParameters:
- name: data
type: object
mapping: "$.data.data"
- name: metadata
type: object
mapping: "$.data.metadata"
consumes:
- type: http
namespace: vault
baseUri: "https://vault.bhp.com/v1"
authentication:
type: bearer
token: "$secrets.vault_token"
resources:
- name: secrets
path: "/{{path}}"
inputParameters:
- name: path
in: path
operations:
- name: read-secret
method: GET
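The `$.data.data` and `$.data.metadata` mappings reflect Vault's KV version 2 response shape, where the secret payload and its metadata each sit one level under a top-level `data` key. A sketch against a sample response (values illustrative):

```python
# Sample of a Vault KV v2 GET /v1/secret/data/<path> response.
sample_response = {
    "data": {
        "data": {"username": "svc-automation", "password": "s3cret"},
        "metadata": {"version": 3, "created_time": "2024-01-10T00:00:00Z"},
    }
}

# "$.data.data" -> the secret key/value payload itself.
secret = sample_response["data"]["data"]
metadata = sample_response["data"]["metadata"]
```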
Retrieves haul truck telematics from Snowflake, optimizes routes via Azure ML, updates dispatch schedules in SAP, and reports to the fleet manager via Microsoft Outlook.
naftiko: "0.5"
info:
label: "Haul Truck Fleet Optimization Pipeline"
description: "Retrieves haul truck telematics from Snowflake, optimizes routes via Azure ML, updates dispatch schedules in SAP, and reports to the fleet manager via Microsoft Outlook."
tags:
- fleet-management
- optimization
- snowflake
- azure-machine-learning
- sap
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: haul-fleet
port: 8080
tools:
- name: optimize-haul-fleet
description: "Analyze telematics, optimize routes, update dispatch, and report."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: shift
in: body
type: string
description: "The shift period (e.g., DAY, NIGHT)."
- name: manager_email
in: body
type: string
description: "Fleet manager email."
steps:
- name: get-telematics
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM HAUL_TRUCK_TELEMATICS WHERE mine_site = '{{mine_site}}' AND shift = '{{shift}}' AND telemetry_date = CURRENT_DATE()"
warehouse: "FLEET_WH"
- name: optimize-routes
type: call
call: "azureml.score"
with:
model_type: "haul_route_optimization"
data: "{{get-telematics.results}}"
- name: update-dispatch
type: call
call: "sap.update-dispatch-schedule"
with:
mine_site: "{{mine_site}}"
shift: "{{shift}}"
optimized_routes: "{{optimize-routes.routes}}"
- name: send-report
type: call
call: "outlook.send-mail"
with:
to: "{{manager_email}}"
subject: "Haul Fleet Optimization: {{mine_site}} - {{shift}} Shift"
body: "Fleet optimization complete. Trucks: {{optimize-routes.truck_count}}. Estimated fuel savings: {{optimize-routes.fuel_savings_pct}}%. Cycle time improvement: {{optimize-routes.cycle_improvement_min}} min."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_DISPATCH_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: dispatch
path: "/A_DispatchSchedule"
operations:
- name: update-dispatch-schedule
method: PATCH
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
Monitors proximity detection data from Datadog, identifies high-risk vehicle interactions via Azure ML, creates safety records in ServiceNow, and alerts the safety team via PagerDuty.
naftiko: "0.5"
info:
label: "Heavy Vehicle Interaction Alert Pipeline"
description: "Monitors proximity detection data from Datadog, identifies high-risk vehicle interactions via Azure ML, creates safety records in ServiceNow, and alerts the safety team via PagerDuty."
tags:
- vehicle-safety
- proximity-detection
- datadog
- azure-machine-learning
- servicenow
- pagerduty
capability:
exposes:
- type: mcp
namespace: hvi-alert
port: 8080
tools:
- name: monitor-vehicle-interactions
description: "Monitor vehicle proximity, identify risks, create records, and alert."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: pagerduty_service
in: body
type: string
description: "PagerDuty service for safety."
steps:
- name: get-proximity-data
type: call
call: "datadog.query-metrics"
with:
query: "count:vehicle.proximity.breach{site:{{mine_site}}} by {vehicle_pair}"
from: "-4h"
- name: identify-risks
type: call
call: "azureml.score"
with:
model_type: "hvi_risk"
data: "{{get-proximity-data.series}}"
- name: create-record
type: call
call: "servicenow.create-incident"
with:
short_description: "HVI alert: {{mine_site}} — {{identify-risks.high_risk_count}} high-risk interactions"
description: "Total breaches: {{identify-risks.total_breaches}}. High risk: {{identify-risks.high_risk_count}}. Worst pair: {{identify-risks.worst_pair}}."
urgency: "1"
category: "safety"
- name: alert-safety
type: call
call: "pagerduty.create-incident"
with:
service_id: "{{pagerduty_service}}"
title: "HVI: {{mine_site}} — {{identify-risks.high_risk_count}} high-risk vehicle interactions"
urgency: "high"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: pagerduty
baseUri: "https://api.pagerduty.com"
authentication:
type: bearer
token: "$secrets.pagerduty_token"
resources:
- name: incidents
path: "/incidents"
operations:
- name: create-incident
method: POST
Retrieves a HubSpot contact by email for mining industry marketing and stakeholder engagement.
naftiko: "0.5"
info:
label: "HubSpot Mining Marketing Contact Lookup"
description: "Retrieves a HubSpot contact by email for mining industry marketing and stakeholder engagement."
tags:
- marketing
- hubspot
- contacts
capability:
exposes:
- type: mcp
namespace: marketing-crm
port: 8080
tools:
- name: get-contact
description: "Look up a HubSpot contact by email."
inputParameters:
- name: email
in: body
type: string
description: "The contact email address."
call: "hubspot.get-contact-by-email"
with:
email: "{{email}}"
outputParameters:
- name: name
type: string
mapping: "$.properties.firstname"
- name: company
type: string
mapping: "$.properties.company"
- name: lifecycle_stage
type: string
mapping: "$.properties.lifecyclestage"
consumes:
- type: http
namespace: hubspot
baseUri: "https://api.hubapi.com/crm/v3"
authentication:
type: bearer
token: "$secrets.hubspot_token"
resources:
- name: contacts
path: "/objects/contacts/{{email}}?idProperty=email"
inputParameters:
- name: email
in: path
operations:
- name: get-contact-by-email
method: GET
Retrieves ILUA obligations from ServiceNow, checks compliance deadlines in Snowflake, uploads reports to SharePoint, and notifies community relations via Microsoft Outlook.
naftiko: "0.5"
info:
label: "Indigenous Land Use Agreement Tracking Pipeline"
description: "Retrieves ILUA obligations from ServiceNow, checks compliance deadlines in Snowflake, uploads reports to SharePoint, and notifies community relations via Microsoft Outlook."
tags:
- community-relations
- compliance
- servicenow
- snowflake
- sharepoint
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: ilua-tracking
port: 8080
tools:
- name: track-ilua-compliance
description: "Track ILUA obligations, check deadlines, upload reports, and notify."
inputParameters:
- name: agreement_id
in: body
type: string
description: "The ILUA agreement identifier."
- name: community_email
in: body
type: string
description: "Community relations team email."
steps:
- name: get-obligations
type: call
call: "servicenow.get-records"
with:
table: "u_ilua_obligation"
query: "agreement_id={{agreement_id}}"
- name: check-deadlines
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL CHECK_ILUA_DEADLINES('{{agreement_id}}')"
warehouse: "COMPLIANCE_WH"
- name: upload-report
type: call
call: "sharepoint.upload-file"
with:
site_id: "community_relations_site"
folder_path: "ILUA/{{agreement_id}}"
file_name: "compliance_report_{{agreement_id}}.pdf"
- name: notify-team
type: call
call: "outlook.send-mail"
with:
to: "{{community_email}}"
subject: "ILUA Compliance: {{agreement_id}}"
body: "Obligations: {{get-obligations.count}}. Overdue: {{check-deadlines.overdue_count}}. Next due: {{check-deadlines.next_deadline}}."
consumes:
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: get-records
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
Triggers an Informatica Cloud data integration task for mining data ETL processing.
naftiko: "0.5"
info:
label: "Informatica ETL Pipeline Trigger"
description: "Triggers an Informatica Cloud data integration task for mining data ETL processing."
tags:
- data-integration
- informatica
- etl
capability:
exposes:
- type: mcp
namespace: data-integration
port: 8080
tools:
- name: trigger-informatica-task
description: "Trigger an Informatica Cloud task by task ID."
inputParameters:
- name: task_id
in: body
type: string
description: "The Informatica Cloud task ID."
- name: task_type
in: body
type: string
description: "The task type (e.g., DSS, MTT)."
call: "informatica.start-task"
with:
taskId: "{{task_id}}"
taskType: "{{task_type}}"
consumes:
- type: http
namespace: informatica
baseUri: "https://na1.dm-us.informaticacloud.com/saas/api/v2"
authentication:
type: bearer
token: "$secrets.informatica_token"
resources:
- name: jobs
path: "/job"
operations:
- name: start-task
method: POST
Fetches a Jira issue by key and returns summary, status, assignee, and priority for mining project management.
naftiko: "0.5"
info:
label: "Jira Issue Retrieval"
description: "Fetches a Jira issue by key and returns summary, status, assignee, and priority for mining project management."
tags:
- project-management
- jira
- mining
capability:
exposes:
- type: mcp
namespace: project-tracking
port: 8080
tools:
- name: get-jira-issue
description: "Look up a Jira issue by key."
inputParameters:
- name: issue_key
in: body
type: string
description: "The Jira issue key (e.g., MINE-1234)."
call: "jira.get-issue"
with:
issue_key: "{{issue_key}}"
outputParameters:
- name: summary
type: string
mapping: "$.fields.summary"
- name: status
type: string
mapping: "$.fields.status.name"
- name: assignee
type: string
mapping: "$.fields.assignee.displayName"
- name: priority
type: string
mapping: "$.fields.priority.name"
consumes:
- type: http
namespace: jira
baseUri: "https://bhp.atlassian.net/rest/api/3"
authentication:
type: basic
username: "$secrets.jira_user"
password: "$secrets.jira_api_token"
resources:
- name: issues
path: "/issue/{{issue_key}}"
inputParameters:
- name: issue_key
in: path
operations:
- name: get-issue
method: GET
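Jira Cloud's `basic` authentication is the account email plus an API token, base64-encoded into an `Authorization` header. A standard-library sketch (credentials are placeholders):

```python
import base64

def basic_auth_header(user: str, api_token: str) -> str:
    # HTTP Basic: "Basic " + base64("user:token"), per RFC 7617.
    raw = f"{user}:{api_token}".encode()
    return "Basic " + base64.b64encode(raw).decode()

header = basic_auth_header("bot@bhp.com", "token123")
```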
Creates a LinkedIn company page post for BHP employer branding and mining career promotion.
naftiko: "0.5"
info:
label: "LinkedIn Employer Branding Post"
description: "Creates a LinkedIn company page post for BHP employer branding and mining career promotion."
tags:
- hr
- marketing
- linkedin
- recruitment
capability:
exposes:
- type: mcp
namespace: social-media
port: 8080
tools:
- name: create-linkedin-post
description: "Create a LinkedIn company page post."
inputParameters:
- name: text
in: body
type: string
description: "The post text content."
- name: image_url
in: body
type: string
description: "Optional image URL to include."
call: "linkedin.create-post"
with:
text: "{{text}}"
image_url: "{{image_url}}"
consumes:
- type: http
namespace: linkedin
baseUri: "https://api.linkedin.com/v2"
authentication:
type: bearer
token: "$secrets.linkedin_token"
resources:
- name: posts
path: "/ugcPosts"
operations:
- name: create-post
method: POST
Triggers a Microsoft Power Automate flow to populate an Excel report template with mining data.
naftiko: "0.5"
info:
label: "Microsoft Excel Report Generation"
description: "Triggers a Microsoft Power Automate flow to populate an Excel report template with mining data."
tags:
- reporting
- microsoft-excel
- microsoft-power-automate
capability:
exposes:
- type: mcp
namespace: report-gen
port: 8080
tools:
- name: generate-excel-report
description: "Trigger a Power Automate flow to generate an Excel report."
inputParameters:
- name: flow_url
in: body
type: string
description: "The Power Automate HTTP trigger URL."
- name: report_name
in: body
type: string
description: "The output report file name."
call: "powerautomate.trigger-flow"
with:
flow_url: "{{flow_url}}"
report_name: "{{report_name}}"
consumes:
- type: http
namespace: powerautomate
baseUri: "https://prod-00.australiasoutheast.logic.azure.com"
authentication:
type: bearer
token: "$secrets.powerautomate_token"
resources:
- name: flows
path: "/workflows/{{flow_url}}"
inputParameters:
- name: flow_url
in: path
operations:
- name: trigger-flow
method: POST
Sends an email via Microsoft Outlook for automated mining operations communications.
naftiko: "0.5"
info:
label: "Microsoft Outlook Email Notification"
description: "Sends an email via Microsoft Outlook for automated mining operations communications."
tags:
- communication
- microsoft-outlook
- email
capability:
exposes:
- type: mcp
namespace: email-comms
port: 8080
tools:
- name: send-email
description: "Send an email via Microsoft Outlook."
inputParameters:
- name: to
in: body
type: string
description: "Recipient email address."
- name: subject
in: body
type: string
description: "Email subject line."
- name: body
in: body
type: string
description: "Email body content."
call: "outlook.send-mail"
with:
to: "{{to}}"
subject: "{{subject}}"
body: "{{body}}"
consumes:
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
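Microsoft Graph's `sendMail` action wraps the message fields in an envelope rather than taking them flat. A sketch of how `to`/`subject`/`body` map onto it (shape follows Graph's documented sendMail body; `saveToSentItems` is optional, and the addresses are placeholders):

```python
def build_send_mail(to: str, subject: str, body: str) -> dict:
    # Microsoft Graph POST /me/sendMail envelope.
    return {
        "message": {
            "subject": subject,
            "body": {"contentType": "Text", "content": body},
            "toRecipients": [{"emailAddress": {"address": to}}],
        },
        "saveToSentItems": True,
    }

payload = build_send_mail("ops@bhp.com", "Shift report", "All clear.")
```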
Triggers a Microsoft Power Automate flow for automated mining operations workflows and approval processes.
naftiko: "0.5"
info:
label: "Microsoft Power Automate Flow Trigger"
description: "Triggers a Microsoft Power Automate flow for automated mining operations workflows and approval processes."
tags:
- automation
- microsoft-power-automate
capability:
exposes:
- type: mcp
namespace: power-automate
port: 8080
tools:
- name: trigger-flow
description: "Trigger a Power Automate flow via HTTP trigger."
inputParameters:
- name: flow_url
in: body
type: string
description: "The Power Automate HTTP trigger URL."
- name: payload
in: body
type: string
description: "JSON payload for the flow."
call: "powerautomate.trigger"
with:
url: "{{flow_url}}"
body: "{{payload}}"
outputParameters:
- name: run_id
type: string
mapping: "$.runId"
- name: status
type: string
mapping: "$.status"
consumes:
- type: http
namespace: powerautomate
baseUri: "https://prod-aus.logic.azure.com"
authentication:
type: apiKey
name: "sig"
in: query
value: "$secrets.power_automate_sig"
resources:
- name: triggers
path: "/workflows/trigger"
operations:
- name: trigger
method: POST
Sends a message to a Microsoft Teams channel for operational notifications.
naftiko: "0.5"
info:
label: "Microsoft Teams Channel Message"
description: "Sends a message to a Microsoft Teams channel for operational notifications."
tags:
- collaboration
- microsoft-teams
- notification
capability:
exposes:
- type: mcp
namespace: team-comms
port: 8080
tools:
- name: send-channel-message
description: "Post a message to a Microsoft Teams channel."
inputParameters:
- name: team_id
in: body
type: string
description: "The Microsoft Teams team ID."
- name: channel_id
in: body
type: string
description: "The channel ID within the team."
- name: message
in: body
type: string
description: "The message text to post."
call: "msteams.post-channel-message"
with:
team_id: "{{team_id}}"
channel_id: "{{channel_id}}"
text: "{{message}}"
consumes:
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{team_id}}/channels/{{channel_id}}/messages"
inputParameters:
- name: team_id
in: path
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
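Graph channel messages also use an envelope: the text goes inside a `body` object rather than a flat `text` field. A sketch (shape follows Graph's channel-message API; choosing `text` over `html` as the content type is the caller's decision, and the message is illustrative):

```python
def build_channel_message(text: str) -> dict:
    # Microsoft Graph POST /teams/{team}/channels/{channel}/messages body.
    return {"body": {"contentType": "text", "content": text}}

msg = build_channel_message("Crusher 2 restart complete.")
```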
Retrieves closure obligations from ServiceNow, calculates rehabilitation costs in Snowflake, updates financial provisions in SAP, and distributes the closure plan via SharePoint.
naftiko: "0.5"
info:
label: "Mine Closure Planning Orchestrator"
description: "Retrieves closure obligations from ServiceNow, calculates rehabilitation costs in Snowflake, updates financial provisions in SAP, and distributes the closure plan via SharePoint."
tags:
- mine-closure
- rehabilitation
- servicenow
- snowflake
- sap
- sharepoint
capability:
exposes:
- type: mcp
namespace: mine-closure
port: 8080
tools:
- name: plan-mine-closure
description: "Retrieve obligations, calculate costs, update provisions, and publish plan."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
steps:
- name: get-obligations
type: call
call: "servicenow.get-records"
with:
table: "u_closure_obligation"
query: "mine_site={{mine_site}}"
- name: calculate-costs
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL CALCULATE_CLOSURE_COSTS('{{mine_site}}')"
warehouse: "FINANCE_WH"
- name: update-provisions
type: call
call: "sap.update-provision"
with:
mine_site: "{{mine_site}}"
total_cost: "{{calculate-costs.total_rehabilitation_cost}}"
provision_amount: "{{calculate-costs.required_provision}}"
- name: publish-plan
type: call
call: "sharepoint.upload-file"
with:
site_id: "closure_planning_site"
folder_path: "ClosurePlans/{{mine_site}}"
file_name: "closure_plan_{{mine_site}}.pdf"
consumes:
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: get-records
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_PROVISION_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: provisions
path: "/A_Provision"
operations:
- name: update-provision
method: PATCH
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
Monitors groundwater levels from Datadog sensors, forecasts inflow via Azure ML, adjusts pump schedules in SAP, and alerts hydrogeology via Microsoft Teams.
naftiko: "0.5"
info:
label: "Mine Dewatering Management Pipeline"
description: "Monitors groundwater levels from Datadog sensors, forecasts inflow via Azure ML, adjusts pump schedules in SAP, and alerts hydrogeology via Microsoft Teams."
tags:
- dewatering
- hydrogeology
- datadog
- azure-machine-learning
- sap
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: mine-dewatering
port: 8080
tools:
- name: manage-dewatering
description: "Monitor groundwater, forecast inflow, adjust pumps, and alert team."
inputParameters:
- name: pit_id
in: body
type: string
description: "The open pit identifier."
- name: hydro_channel
in: body
type: string
description: "Microsoft Teams hydrogeology channel."
steps:
- name: get-water-levels
type: call
call: "datadog.query-metrics"
with:
query: "avg:mine.groundwater.level{pit:{{pit_id}}} by {bore}"
from: "-24h"
- name: forecast-inflow
type: call
call: "azureml.score"
with:
model_type: "groundwater_inflow"
data: "{{get-water-levels.series}}"
- name: adjust-pumps
type: call
call: "sap.update-pump-schedule"
with:
pit_id: "{{pit_id}}"
required_rate: "{{forecast-inflow.required_pump_rate_lps}}"
- name: alert-hydro
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{hydro_channel}}"
text: "Dewatering pit {{pit_id}}: Inflow forecast {{forecast-inflow.inflow_rate_lps}} L/s. Required pump rate: {{forecast-inflow.required_pump_rate_lps}} L/s. Schedule updated."
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_PUMP_SCHEDULE_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: pump-schedules
path: "/A_PumpSchedule"
operations:
- name: update-pump-schedule
method: PATCH
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
When Azure Machine Learning detects an anomaly in equipment telemetry, this pipeline creates an SAP PM work order, opens a ServiceNow ticket, and alerts the maintenance team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Mine Equipment Predictive Maintenance Pipeline"
description: "When Azure Machine Learning detects an anomaly in equipment telemetry, creates a SAP PM work order, opens a ServiceNow ticket, and alerts the maintenance team via Microsoft Teams."
tags:
- predictive-maintenance
- azure-machine-learning
- sap
- servicenow
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: predictive-maintenance
port: 8080
tools:
- name: trigger-predictive-maintenance
description: "Given an equipment ID with detected anomaly, orchestrate preventive work order creation across SAP, ServiceNow, and Microsoft Teams."
inputParameters:
- name: equipment_id
in: body
type: string
description: "The SAP equipment ID."
- name: anomaly_type
in: body
type: string
description: "Type of anomaly detected (e.g., vibration, temperature, pressure)."
- name: confidence_score
in: body
type: string
description: "ML model confidence score."
- name: maintenance_channel
in: body
type: string
description: "Microsoft Teams channel for maintenance team."
steps:
- name: create-work-order
type: call
call: "sap.create-maintenance-order"
with:
equipment: "{{equipment_id}}"
maintenance_type: "PM03"
description: "Predictive maintenance: {{anomaly_type}} anomaly detected (confidence: {{confidence_score}})"
- name: open-ticket
type: call
call: "servicenow.create-incident"
with:
short_description: "Predictive maintenance: {{equipment_id}} - {{anomaly_type}}"
category: "equipment_maintenance"
description: "ML model detected {{anomaly_type}} anomaly on {{equipment_id}} with confidence {{confidence_score}}. SAP WO: {{create-work-order.order_number}}."
- name: alert-team
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{maintenance_channel}}"
text: "PREDICTIVE ALERT: {{anomaly_type}} on equipment {{equipment_id}} (confidence: {{confidence_score}}). SAP WO: {{create-work-order.order_number}}. ServiceNow: {{open-ticket.number}}."
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MAINTORDER"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: maintenance-orders
path: "/MaintenanceOrder"
operations:
- name: create-maintenance-order
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Compares actual mining progress from SAP against the mine plan in Snowflake, calculates variance, creates action items in Jira, and refreshes the compliance dashboard in Power BI.
naftiko: "0.5"
info:
label: "Mine Plan Compliance Check Pipeline"
description: "Compares actual mining progress from SAP against the mine plan in Snowflake, calculates variance, creates action items in Jira, and refreshes the compliance dashboard in Power BI."
tags:
- mine-planning
- compliance
- sap
- snowflake
- jira
- power-bi
capability:
exposes:
- type: mcp
namespace: mine-plan-compliance
port: 8080
tools:
- name: check-mine-plan-compliance
description: "Compare actual vs plan, calculate variance, create actions, and report."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: project_key
in: body
type: string
description: "Jira project key."
- name: dataset_id
in: body
type: string
description: "Power BI dataset ID."
- name: group_id
in: body
type: string
description: "Power BI workspace ID."
steps:
- name: get-actuals
type: call
call: "sap.get-production-totals"
with:
mine_site: "{{mine_site}}"
- name: compare-plan
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL COMPARE_MINE_PLAN_VS_ACTUAL('{{mine_site}}')"
warehouse: "PLANNING_WH"
- name: create-actions
type: call
call: "jira.create-issue"
with:
project_key: "{{project_key}}"
summary: "Mine plan variance: {{mine_site}} — {{compare-plan.variance_pct}}%"
description: "Planned: {{compare-plan.planned_tonnes}}t. Actual: {{compare-plan.actual_tonnes}}t. Variance: {{compare-plan.variance_pct}}%."
issue_type: "Task"
- name: refresh-dashboard
type: call
call: "powerbi.refresh-dataset"
with:
group_id: "{{group_id}}"
dataset_id: "{{dataset_id}}"
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_PRODUCTION_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: production
path: "/A_ProductionTotals"
operations:
- name: get-production-totals
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: jira
baseUri: "https://bhp.atlassian.net/rest/api/3"
authentication:
type: basic
username: "$secrets.jira_user"
password: "$secrets.jira_api_token"
resources:
- name: issues
path: "/issue"
operations:
- name: create-issue
method: POST
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
When a new mine site hire is created in Workday, this orchestrator opens a ServiceNow onboarding ticket, provisions a SharePoint safety documentation folder, and sends a Microsoft Teams welcome message.
naftiko: "0.5"
info:
label: "Mine Site Employee Onboarding Orchestrator"
description: "On new mine site hire creation in Workday, opens a ServiceNow onboarding ticket, provisions a SharePoint safety documentation folder, and sends a Microsoft Teams welcome message."
tags:
- hr
- onboarding
- mining
- workday
- servicenow
- sharepoint
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: hr-onboarding
port: 8080
tools:
- name: trigger-site-onboarding
description: "Given a Workday employee ID and mine site, orchestrate the full onboarding sequence across ServiceNow, SharePoint, and Microsoft Teams."
inputParameters:
- name: workday_employee_id
in: body
type: string
description: "The Workday worker ID for the new hire."
- name: mine_site
in: body
type: string
description: "The mine site location code."
- name: start_date
in: body
type: string
description: "The employee start date in YYYY-MM-DD format."
steps:
- name: get-employee
type: call
call: "workday.get-worker"
with:
worker_id: "{{workday_employee_id}}"
- name: open-ticket
type: call
call: "servicenow.create-incident"
with:
short_description: "New mine site hire onboarding: {{get-employee.full_name}}"
category: "hr_onboarding"
assigned_group: "Site_Operations_{{mine_site}}"
description: "Onboarding for {{get-employee.full_name}} starting {{start_date}} at {{mine_site}}."
- name: provision-folder
type: call
call: "sharepoint.create-folder"
with:
site_id: "safety_onboarding_site"
folder_path: "SiteOnboarding/{{get-employee.full_name}}_{{mine_site}}"
- name: send-welcome
type: call
call: "msteams.send-message"
with:
recipient_upn: "{{get-employee.work_email}}"
text: "Welcome to BHP {{mine_site}}, {{get-employee.first_name}}! Your onboarding ticket is {{open-ticket.number}}. Safety docs: {{provision-folder.url}}."
consumes:
- type: http
namespace: workday
baseUri: "https://wd2-impl-services1.workday.com/ccx/api/v1"
authentication:
type: bearer
token: "$secrets.workday_token"
resources:
- name: workers
path: "/workers/{{worker_id}}"
inputParameters:
- name: worker_id
in: path
operations:
- name: get-worker
method: GET
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: drive-items
path: "/{{site_id}}/drive/root:/{{folder_path}}"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
operations:
- name: create-folder
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: messages
path: "/users/{{recipient_upn}}/sendMail"
inputParameters:
- name: recipient_upn
in: path
operations:
- name: send-message
method: POST
Monitors underground ventilation sensors via Datadog, optimizes airflow via Azure ML, updates fan schedules in SAP, and alerts the ventilation officer via Microsoft Teams.
naftiko: "0.5"
info:
label: "Mine Ventilation Optimization Pipeline"
description: "Monitors underground ventilation sensors via Datadog, optimizes airflow via Azure ML, updates fan schedules in SAP, and alerts the ventilation officer via Microsoft Teams."
tags:
- underground-mining
- ventilation
- datadog
- azure-machine-learning
- sap
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: mine-ventilation
port: 8080
tools:
- name: optimize-ventilation
description: "Monitor ventilation, optimize airflow, update fans, and alert."
inputParameters:
- name: mine_id
in: body
type: string
description: "The underground mine identifier."
- name: vent_channel
in: body
type: string
description: "Microsoft Teams ventilation channel."
steps:
- name: get-airflow
type: call
call: "datadog.query-metrics"
with:
query: "avg:mine.ventilation.airflow{mine:{{mine_id}}} by {zone}"
from: "-4h"
- name: optimize-fans
type: call
call: "azureml.score"
with:
model_type: "ventilation_optimization"
data: "{{get-airflow.series}}"
- name: update-schedules
type: call
call: "sap.update-fan-schedule"
with:
mine_id: "{{mine_id}}"
fan_settings: "{{optimize-fans.recommended_settings}}"
- name: alert-officer
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{vent_channel}}"
text: "Ventilation {{mine_id}}: Optimized. Energy savings: {{optimize-fans.energy_savings_pct}}%. All zones above minimum airflow requirements."
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_VENTILATION_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: fan-schedules
path: "/A_FanSchedule"
operations:
- name: update-fan-schedule
method: PATCH
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Queries processing plant throughput from Snowflake, detects bottlenecks via Azure ML, creates maintenance requests in SAP, and refreshes the operations dashboard in Power BI.
naftiko: "0.5"
info:
label: "Mineral Processing Throughput Pipeline"
description: "Queries processing plant throughput from Snowflake, detects bottlenecks via Azure ML, creates maintenance requests in SAP, and refreshes the operations dashboard in Power BI."
tags:
- mineral-processing
- throughput
- snowflake
- azure-machine-learning
- sap
- power-bi
capability:
exposes:
- type: mcp
namespace: processing-throughput
port: 8080
tools:
- name: analyze-throughput
description: "Analyze processing throughput, detect bottlenecks, and update systems."
inputParameters:
- name: plant_id
in: body
type: string
description: "The processing plant identifier."
- name: dataset_id
in: body
type: string
description: "Power BI dataset ID."
- name: group_id
in: body
type: string
description: "Power BI workspace ID."
steps:
- name: get-throughput
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM PLANT_THROUGHPUT WHERE plant_id = '{{plant_id}}' AND reading_time >= DATEADD(hour, -24, CURRENT_TIMESTAMP())"
warehouse: "OPERATIONS_WH"
- name: detect-bottlenecks
type: call
call: "azureml.score"
with:
model_type: "throughput_bottleneck"
data: "{{get-throughput.results}}"
- name: create-maintenance
type: call
call: "sap.create-work-order"
with:
order_type: "CORRECTIVE"
plant_id: "{{plant_id}}"
description: "Bottleneck detected: {{detect-bottlenecks.bottleneck_unit}}. Throughput loss: {{detect-bottlenecks.loss_tph}} tph."
- name: refresh-dashboard
type: call
call: "powerbi.refresh-dataset"
with:
group_id: "{{group_id}}"
dataset_id: "{{dataset_id}}"
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MAINTORDER"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: work-orders
path: "/MaintenanceOrder"
operations:
- name: create-work-order
method: POST
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
When a mining permit nears expiry, this pipeline retrieves the permit from Oracle EBS, creates a ServiceNow change request, assigns it to a legal team member via Workday, and notifies the compliance team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Mining Permit Renewal Pipeline"
description: "When a mining permit nears expiry, retrieves the permit from Oracle EBS, creates a ServiceNow change request, assigns to the legal team via Workday, and notifies via Microsoft Teams."
tags:
- compliance
- permits
- oracle-e-business-suite
- servicenow
- workday
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: permit-management
port: 8080
tools:
- name: trigger-permit-renewal
description: "Given a permit ID, orchestrate renewal workflow across Oracle EBS, ServiceNow, Workday, and Microsoft Teams."
inputParameters:
- name: permit_id
in: body
type: string
description: "The Oracle EBS permit record ID."
- name: legal_assignee_id
in: body
type: string
description: "Workday employee ID of the legal team member."
- name: compliance_channel
in: body
type: string
description: "Microsoft Teams channel for compliance updates."
steps:
- name: get-permit
type: call
call: "oracle-ebs.get-permit"
with:
permit_id: "{{permit_id}}"
- name: get-assignee
type: call
call: "workday.get-worker"
with:
worker_id: "{{legal_assignee_id}}"
- name: create-change-request
type: call
call: "servicenow.create-change-request"
with:
short_description: "Mining permit renewal: {{get-permit.permit_number}}"
description: "Permit {{get-permit.permit_number}} for {{get-permit.mine_site}} expires {{get-permit.expiry_date}}. Assigned to {{get-assignee.full_name}}."
assigned_to: "{{get-assignee.work_email}}"
- name: notify-compliance
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{compliance_channel}}"
text: "Permit renewal initiated: {{get-permit.permit_number}} for {{get-permit.mine_site}}. Expires: {{get-permit.expiry_date}}. Assigned to {{get-assignee.full_name}}. CR: {{create-change-request.number}}."
consumes:
- type: http
namespace: oracle-ebs
baseUri: "https://bhp-ebs.oraclecloud.com/webservices/rest/v1"
authentication:
type: bearer
token: "$secrets.oracle_ebs_token"
resources:
- name: permits
path: "/permits/{{permit_id}}"
inputParameters:
- name: permit_id
in: path
operations:
- name: get-permit
method: GET
- type: http
namespace: workday
baseUri: "https://wd2-impl-services1.workday.com/ccx/api/v1"
authentication:
type: bearer
token: "$secrets.workday_token"
resources:
- name: workers
path: "/workers/{{worker_id}}"
inputParameters:
- name: worker_id
in: path
operations:
- name: get-worker
method: GET
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: change-requests
path: "/table/change_request"
operations:
- name: create-change-request
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Retrieves application performance metrics from New Relic for BHP mine site applications.
naftiko: "0.5"
info:
label: "New Relic Site Performance Lookup"
description: "Retrieves application performance metrics from New Relic for BHP mine site applications."
tags:
- monitoring
- new-relic
- performance
capability:
exposes:
- type: mcp
namespace: apm-monitoring
port: 8080
tools:
- name: get-app-performance
description: "Look up New Relic APM metrics for an application."
inputParameters:
- name: app_id
in: body
type: string
description: "The New Relic application ID."
call: "newrelic.get-app-metrics"
with:
app_id: "{{app_id}}"
outputParameters:
- name: response_time
type: string
mapping: "$.application.application_summary.response_time"
- name: throughput
type: string
mapping: "$.application.application_summary.throughput"
- name: error_rate
type: string
mapping: "$.application.application_summary.error_rate"
consumes:
- type: http
namespace: newrelic
baseUri: "https://api.newrelic.com/v2"
authentication:
type: apiKey
name: "X-Api-Key"
in: header
value: "$secrets.newrelic_api_key"
resources:
- name: applications
path: "/applications/{{app_id}}.json"
inputParameters:
- name: app_id
in: path
operations:
- name: get-app-metrics
method: GET
Creates an OpsGenie alert for mine processing plant issues, routing to the on-call operations team.
naftiko: "0.5"
info:
label: "OpsGenie Alert Creation"
description: "Creates an OpsGenie alert for mine processing plant issues, routing to the on-call operations team."
tags:
- alerting
- opsgenie
capability:
exposes:
- type: mcp
namespace: opsgenie
port: 8080
tools:
- name: create-alert
description: "Create a new OpsGenie alert."
inputParameters:
- name: message
in: body
type: string
description: "Alert message."
- name: priority
in: body
type: string
description: "Priority level: P1 through P5."
- name: team
in: body
type: string
description: "Responder team name."
call: "opsgenie.create-alert"
with:
message: "{{message}}"
priority: "{{priority}}"
team: "{{team}}"
outputParameters:
- name: request_id
type: string
mapping: "$.requestId"
- name: result
type: string
mapping: "$.result"
consumes:
- type: http
namespace: opsgenie
baseUri: "https://api.opsgenie.com/v2"
authentication:
type: apiKey
name: "Authorization"
in: header
value: "GenieKey $secrets.opsgenie_api_key"
resources:
- name: alerts
path: "/alerts"
operations:
- name: create-alert
method: POST
Retrieves Oracle Cloud Infrastructure cost summary for mining IT operations by compartment and date range.
naftiko: "0.5"
info:
label: "Oracle Cloud Cost Report"
description: "Retrieves Oracle Cloud Infrastructure cost summary for mining IT operations by compartment and date range."
tags:
- cloud
- oracle-cloud
- cost-management
- finance
capability:
exposes:
- type: mcp
namespace: cloud-finance
port: 8080
tools:
- name: get-cost-report
description: "Retrieve OCI cost summary for a compartment and date range."
inputParameters:
- name: compartment_id
in: body
type: string
description: "The OCI compartment OCID."
- name: start_date
in: body
type: string
description: "Start date in YYYY-MM-DD format."
- name: end_date
in: body
type: string
description: "End date in YYYY-MM-DD format."
call: "oci.get-cost-summary"
with:
compartment_id: "{{compartment_id}}"
start_date: "{{start_date}}"
end_date: "{{end_date}}"
consumes:
- type: http
namespace: oci
baseUri: "https://usagecost.oraclecloud.com/20200107"
authentication:
type: bearer
token: "$secrets.oci_token"
resources:
- name: cost-summaries
path: "/usage"
operations:
- name: get-cost-summary
method: POST
Pulls geological survey data from Snowflake, runs the Azure ML ore grade prediction model, updates SAP mine planning, and notifies the geology team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Ore Grade Forecasting Pipeline"
description: "Pulls geological survey data from Snowflake, runs the Azure ML ore grade prediction model, updates SAP mine planning, and notifies the geology team via Microsoft Teams."
tags:
- geology
- forecasting
- snowflake
- azure-machine-learning
- sap
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: ore-forecasting
port: 8080
tools:
- name: forecast-ore-grade
description: "Given a mine block ID, run ore grade forecasting across Snowflake, Azure ML, SAP, and Microsoft Teams."
inputParameters:
- name: block_id
in: body
type: string
description: "The mine block identifier."
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: geology_channel
in: body
type: string
description: "Microsoft Teams channel for geology team."
steps:
- name: get-survey-data
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM GEOLOGICAL_SURVEYS WHERE block_id = '{{block_id}}' ORDER BY survey_date DESC LIMIT 100"
warehouse: "GEOLOGY_WH"
- name: predict-grade
type: call
call: "azureml.score"
with:
equipment_id: "{{block_id}}"
data: "{{get-survey-data.results}}"
- name: update-mine-plan
type: call
call: "sap.update-mine-block"
with:
block_id: "{{block_id}}"
predicted_grade: "{{predict-grade.predicted_grade}}"
confidence: "{{predict-grade.confidence}}"
- name: notify-geology
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{geology_channel}}"
text: "Ore grade forecast for block {{block_id}} at {{mine_site}}: Predicted grade {{predict-grade.predicted_grade}}% (confidence: {{predict-grade.confidence}}). SAP mine plan updated."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MINE_PLANNING_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: mine-blocks
path: "/A_MineBlock('{{block_id}}')"
inputParameters:
- name: block_id
in: path
operations:
- name: update-mine-block
method: PATCH
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Queries stockpile survey data from Snowflake, reconciles against production records in SAP, flags variances in Jira, and refreshes the reconciliation dashboard in Power BI.
naftiko: "0.5"
info:
label: "Ore Stockpile Reconciliation Pipeline"
description: "Queries stockpile survey data from Snowflake, reconciles against production records in SAP, flags variances in Jira, and refreshes the reconciliation dashboard in Power BI."
tags:
- stockpile
- reconciliation
- snowflake
- sap
- jira
- power-bi
capability:
exposes:
- type: mcp
namespace: stockpile-recon
port: 8080
tools:
- name: reconcile-stockpiles
description: "Reconcile stockpile surveys against production and flag variances."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: project_key
in: body
type: string
description: "Jira project key."
- name: dataset_id
in: body
type: string
description: "Power BI dataset ID."
- name: group_id
in: body
type: string
description: "Power BI workspace ID."
steps:
- name: get-survey-data
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL RECONCILE_STOCKPILES('{{mine_site}}')"
warehouse: "OPERATIONS_WH"
- name: get-production
type: call
call: "sap.get-production-totals"
with:
mine_site: "{{mine_site}}"
- name: flag-variances
type: call
call: "jira.create-issue"
with:
project_key: "{{project_key}}"
summary: "Stockpile variance: {{mine_site}} — {{get-survey-data.variance_pct}}%"
description: "Survey total: {{get-survey-data.survey_tonnes}}t. Production records: {{get-production.total_tonnes}}t. Variance: {{get-survey-data.variance_pct}}%."
issue_type: "Task"
- name: refresh-dashboard
type: call
call: "powerbi.refresh-dataset"
with:
group_id: "{{group_id}}"
dataset_id: "{{dataset_id}}"
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_PRODUCTION_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: production
path: "/A_ProductionTotals"
operations:
- name: get-production-totals
method: GET
- type: http
namespace: jira
baseUri: "https://bhp.atlassian.net/rest/api/3"
authentication:
type: basic
username: "$secrets.jira_user"
password: "$secrets.jira_api_token"
resources:
- name: issues
path: "/issue"
operations:
- name: create-issue
method: POST
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
naftiko: "0.5"
info:
label: "PagerDuty Incident Trigger"
description: "Triggers a PagerDuty incident for critical mine site or processing plant alerts."
tags:
- incident-management
- pagerduty
capability:
exposes:
- type: mcp
namespace: pagerduty
port: 8080
tools:
- name: trigger-incident
description: "Create a new PagerDuty incident."
inputParameters:
- name: service_id
in: body
type: string
description: "The PagerDuty service ID."
- name: title
in: body
type: string
description: "Incident title."
- name: urgency
in: body
type: string
description: "Incident urgency: high or low."
call: "pagerduty.create-incident"
with:
service_id: "{{service_id}}"
title: "{{title}}"
urgency: "{{urgency}}"
outputParameters:
- name: incident_id
type: string
mapping: "$.incident.id"
- name: incident_url
type: string
mapping: "$.incident.html_url"
consumes:
- type: http
namespace: pagerduty
baseUri: "https://api.pagerduty.com"
authentication:
type: bearer
token: "$secrets.pagerduty_token"
resources:
- name: incidents
path: "/incidents"
operations:
- name: create-incident
method: POST
Retrieves a Palo Alto Networks firewall security rule for mine site network perimeter.
naftiko: "0.5"
info:
label: "Palo Alto Networks Firewall Rule Lookup"
description: "Retrieves a Palo Alto Networks firewall security rule for mine site network perimeter."
tags:
- security
- palo-alto-networks
- firewall
capability:
exposes:
- type: mcp
namespace: network-security
port: 8080
tools:
- name: get-firewall-rule
description: "Look up a Palo Alto firewall rule by name."
inputParameters:
- name: rule_name
in: body
type: string
description: "The firewall rule name."
call: "paloalto.get-security-rule"
with:
rule_name: "{{rule_name}}"
outputParameters:
- name: source_zones
type: string
mapping: "$.result.entry.from.member"
- name: destination_zones
type: string
mapping: "$.result.entry.to.member"
- name: action
type: string
mapping: "$.result.entry.action"
consumes:
- type: http
namespace: paloalto
baseUri: "https://bhp-fw.paloaltonetworks.com/restapi/v10.1"
authentication:
type: apiKey
name: "X-PAN-KEY"
in: header
value: "$secrets.paloalto_api_key"
resources:
- name: security-rules
path: "/Policies/SecurityRules?name={{rule_name}}"
inputParameters:
- name: rule_name
in: query
operations:
- name: get-security-rule
method: GET
Retrieves drone imagery from S3, processes point clouds in Snowflake, calculates volumes, stores results in Azure Blob, and notifies the survey team via Slack.
naftiko: "0.5"
info:
label: "Pit Survey Drone Orchestrator"
description: "Retrieves drone imagery from S3, processes point clouds in Snowflake, calculates volumes, stores results in Azure Blob, and notifies the survey team via Slack."
tags:
- drone-survey
- volumetrics
- amazon-s3
- snowflake
- azure-blob-storage
- slack
capability:
exposes:
- type: mcp
namespace: pit-survey
port: 8080
tools:
- name: process-pit-survey
description: "Process drone survey, calculate volumes, store results, and notify."
inputParameters:
- name: pit_id
in: body
type: string
description: "The pit identifier."
- name: bucket
in: body
type: string
description: "S3 bucket with drone imagery."
- name: imagery_key
in: body
type: string
description: "S3 key prefix for imagery."
- name: slack_channel
in: body
type: string
description: "Slack channel for survey team."
steps:
- name: get-imagery
type: call
call: "s3.get-object"
with:
bucket: "{{bucket}}"
key: "{{imagery_key}}"
- name: calculate-volumes
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL CALCULATE_PIT_VOLUMES('{{pit_id}}')"
warehouse: "SURVEY_WH"
- name: store-results
type: call
call: "azureblob.put-blob"
with:
container: "pit-surveys"
blob_name: "{{pit_id}}/volume_report.pdf"
- name: notify-survey
type: call
call: "slack.post-message"
with:
channel: "{{slack_channel}}"
text: "Pit survey {{pit_id}} complete. Volume: {{calculate-volumes.total_volume_bcm}} BCM. Change from last: {{calculate-volumes.volume_change_pct}}%. Report: {{store-results.url}}"
consumes:
- type: http
namespace: s3
baseUri: "https://{{bucket}}.s3.amazonaws.com"
authentication:
type: aws-sigv4
accessKeyId: "$secrets.aws_access_key"
secretAccessKey: "$secrets.aws_secret_key"
resources:
- name: objects
path: "/{{key}}"
inputParameters:
- name: bucket
in: path
- name: key
in: path
operations:
- name: get-object
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureblob
baseUri: "https://bhp.blob.core.windows.net"
authentication:
type: apiKey
name: "x-ms-access-key"
in: header
value: "$secrets.azure_storage_key"
resources:
- name: blobs
path: "/{{container}}/{{blob_name}}"
inputParameters:
- name: container
in: path
- name: blob_name
in: path
operations:
- name: put-blob
method: PUT
- type: http
namespace: slack
baseUri: "https://slack.com/api"
authentication:
type: bearer
token: "$secrets.slack_bot_token"
resources:
- name: messages
path: "/chat.postMessage"
operations:
- name: post-message
method: POST
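The calculate-volumes step above maps to Snowflake's SQL API. A sketch of the POST it would issue, assuming the standard SQL API v2 body shape (`statement`, `warehouse`); the stored procedure name comes from the capability spec:

```python
import json

SNOWFLAKE_BASE = "https://bhp.snowflakecomputing.com/api/v2"

def build_volume_statement(pit_id: str, token: str) -> dict:
    """Build the snowflake.execute-statement call for calculate-volumes.

    The JSON body shape follows the Snowflake SQL API v2; the bearer
    token type (OAuth vs. key-pair JWT) is left to the deployment.
    """
    return {
        "method": "POST",
        "url": f"{SNOWFLAKE_BASE}/statements",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "statement": f"CALL CALCULATE_PIT_VOLUMES('{pit_id}')",
            "warehouse": "SURVEY_WH",
        }),
    }
```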
Retrieves vessel schedule from Snowflake, checks stockpile availability in SAP, updates the shipping plan in ServiceNow, and notifies logistics via Microsoft Teams.
naftiko: "0.5"
info:
label: "Port Shipping Coordination Pipeline"
description: "Retrieves vessel schedule from Snowflake, checks stockpile availability in SAP, updates the shipping plan in ServiceNow, and notifies logistics via Microsoft Teams."
tags:
- logistics
- shipping
- snowflake
- sap
- servicenow
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: port-shipping
port: 8080
tools:
- name: coordinate-shipping
description: "Check vessel schedule, stockpile availability, update plan, and notify."
inputParameters:
- name: port_id
in: body
type: string
description: "The port facility identifier."
- name: logistics_channel
in: body
type: string
description: "Microsoft Teams logistics channel."
steps:
- name: get-vessel-schedule
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM VESSEL_SCHEDULE WHERE port_id = '{{port_id}}' AND eta >= CURRENT_DATE() ORDER BY eta LIMIT 10"
warehouse: "LOGISTICS_WH"
- name: check-stockpile
type: call
call: "sap.get-stockpile-availability"
with:
port_id: "{{port_id}}"
- name: update-plan
type: call
call: "servicenow.update-record"
with:
table: "u_shipping_plan"
port_id: "{{port_id}}"
available_tonnes: "{{check-stockpile.available_tonnes}}"
- name: notify-logistics
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{logistics_channel}}"
text: "Port {{port_id}}: Next vessel ETA {{get-vessel-schedule.next_eta}}. Available stock: {{check-stockpile.available_tonnes}}t. Required: {{get-vessel-schedule.required_tonnes}}t."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_STOCKPILE_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: stockpiles
path: "/A_Stockpile"
operations:
- name: get-stockpile-availability
method: GET
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: update-record
method: PATCH
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Triggers a Power BI dataset refresh for mining operations dashboards and returns refresh status.
naftiko: "0.5"
info:
label: "Power BI Dashboard Refresh"
description: "Triggers a Power BI dataset refresh for mining operations dashboards and returns refresh status."
tags:
- analytics
- power-bi
- reporting
capability:
exposes:
- type: mcp
namespace: bi-reporting
port: 8080
tools:
- name: refresh-dataset
description: "Trigger a Power BI dataset refresh by dataset ID."
inputParameters:
- name: dataset_id
in: body
type: string
description: "The Power BI dataset ID."
- name: group_id
in: body
type: string
description: "The Power BI workspace (group) ID."
call: "powerbi.refresh-dataset"
with:
group_id: "{{group_id}}"
dataset_id: "{{dataset_id}}"
consumes:
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
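The refresh-dataset operation above is a plain Power BI REST call; a successful trigger returns HTTP 202 Accepted with an empty body, and refresh status is read back by a GET on the same `/refreshes` collection. A minimal sketch of the request:

```python
POWERBI_BASE = "https://api.powerbi.com/v1.0/myorg"

def build_refresh_request(group_id: str, dataset_id: str, token: str) -> dict:
    """Build the powerbi.refresh-dataset POST.

    The path shape /groups/{group}/datasets/{dataset}/refreshes matches
    the capability spec; polling for completion is not modeled here.
    """
    return {
        "method": "POST",
        "url": f"{POWERBI_BASE}/groups/{group_id}/datasets/{dataset_id}/refreshes",
        "headers": {"Authorization": f"Bearer {token}"},
    }
```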
Aggregates daily mine production data from SAP, refreshes the Power BI production dashboard, uploads a summary to SharePoint, and notifies operations leadership via Microsoft Teams.
naftiko: "0.5"
info:
label: "Production Output Reporting Pipeline"
description: "Aggregates daily mine production data from SAP, refreshes the Power BI production dashboard, uploads a summary to SharePoint, and notifies operations leadership via Microsoft Teams."
tags:
- production
- reporting
- sap
- power-bi
- sharepoint
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: production-reporting
port: 8080
tools:
- name: generate-production-report
description: "Given a mine site and date, aggregate production data and distribute reports."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site plant code."
- name: report_date
in: body
type: string
description: "Report date in YYYY-MM-DD format."
- name: bi_dataset_id
in: body
type: string
description: "Power BI dataset ID for production dashboard."
- name: bi_group_id
in: body
type: string
description: "Power BI workspace ID."
- name: ops_channel
in: body
type: string
description: "Microsoft Teams channel for operations."
steps:
- name: get-production-data
type: call
call: "sap.get-production-output"
with:
plant: "{{mine_site}}"
date: "{{report_date}}"
- name: refresh-dashboard
type: call
call: "powerbi.refresh-dataset"
with:
group_id: "{{bi_group_id}}"
dataset_id: "{{bi_dataset_id}}"
- name: upload-summary
type: call
call: "sharepoint.upload-file"
with:
site_id: "production_reports_site"
folder_path: "DailyReports/{{mine_site}}"
file_name: "production_{{mine_site}}_{{report_date}}.pdf"
- name: notify-ops
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{ops_channel}}"
text: "Production report for {{mine_site}} on {{report_date}}: Output {{get-production-data.total_tonnes}} tonnes. Dashboard refreshed. Report: {{upload-summary.url}}"
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_PRODUCTION_ORDER_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: production-orders
path: "/A_ProductionOrder?$filter=Plant eq '{{plant}}' and ProductionDate eq '{{date}}'"
inputParameters:
- name: plant
in: query
- name: date
in: query
operations:
- name: get-production-output
method: GET
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
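The upload-summary step above uses Microsoft Graph's simple upload (`PUT .../root:/{path}:/content`), which is limited to files of roughly 4 MB; larger reports would need an upload session, which this capability does not model. A sketch of the request it builds:

```python
GRAPH_SITES = "https://graph.microsoft.com/v1.0/sites"

def build_upload_request(site_id: str, folder_path: str, file_name: str,
                         token: str, content: bytes) -> dict:
    """Build the sharepoint.upload-file PUT used by upload-summary.

    The Content-Type assumes a PDF report, per the file_name the
    pipeline generates.
    """
    return {
        "method": "PUT",
        "url": f"{GRAPH_SITES}/{site_id}/drive/root:/{folder_path}/{file_name}:/content",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/pdf",
        },
        "body": content,
    }
```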
Triggers a Qlik Sense app reload for mining operations dashboards.
naftiko: "0.5"
info:
label: "Qlik Sense Mining Dashboard Reload"
description: "Triggers a Qlik Sense app reload for mining operations dashboards."
tags:
- analytics
- qlik-sense
- reporting
capability:
exposes:
- type: mcp
namespace: qlik-analytics
port: 8080
tools:
- name: reload-qlik-app
description: "Trigger a Qlik Sense app reload by app ID."
inputParameters:
- name: app_id
in: body
type: string
description: "The Qlik Sense app ID."
call: "qliksense.reload-app"
with:
app_id: "{{app_id}}"
consumes:
- type: http
namespace: qliksense
baseUri: "https://bhp.qlikcloud.com/api/v1"
authentication:
type: bearer
token: "$secrets.qlik_token"
resources:
- name: reloads
path: "/reloads"
operations:
- name: reload-app
method: POST
Queries train schedules from Snowflake, checks bin levels in SAP, updates dispatch plans in ServiceNow, and notifies rail operations via Microsoft Teams.
naftiko: "0.5"
info:
label: "Rail Loadout Coordination Pipeline"
description: "Queries train schedules from Snowflake, checks bin levels in SAP, updates dispatch plans in ServiceNow, and notifies rail operations via Microsoft Teams."
tags:
- rail-logistics
- loadout
- snowflake
- sap
- servicenow
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: rail-loadout
port: 8080
tools:
- name: coordinate-loadout
description: "Coordinate rail loadout across schedule, bins, dispatch, and notifications."
inputParameters:
- name: loadout_id
in: body
type: string
description: "The rail loadout facility ID."
- name: rail_channel
in: body
type: string
description: "Microsoft Teams rail operations channel."
steps:
- name: get-schedule
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM RAIL_SCHEDULE WHERE loadout_id = '{{loadout_id}}' AND scheduled_date = CURRENT_DATE()"
warehouse: "LOGISTICS_WH"
- name: check-bins
type: call
call: "sap.get-bin-levels"
with:
loadout_id: "{{loadout_id}}"
- name: update-dispatch
type: call
call: "servicenow.update-record"
with:
table: "u_rail_dispatch"
loadout_id: "{{loadout_id}}"
bin_level: "{{check-bins.current_level_pct}}"
- name: notify-rail-ops
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{rail_channel}}"
text: "Loadout {{loadout_id}}: Next train {{get-schedule.next_arrival}}. Bin level: {{check-bins.current_level_pct}}%. Trains today: {{get-schedule.train_count}}."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_LOADOUT_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: bins
path: "/A_BinLevel"
operations:
- name: get-bin-levels
method: GET
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: update-record
method: PATCH
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Queries reagent usage from Snowflake, optimizes dosing via Azure ML, updates procurement in SAP, and reports to the processing manager via Microsoft Teams.
naftiko: "0.5"
info:
label: "Reagent Consumption Optimization Pipeline"
description: "Queries reagent usage from Snowflake, optimizes dosing via Azure ML, updates procurement in SAP, and reports to the processing manager via Microsoft Teams."
tags:
- processing
- reagent-management
- snowflake
- azure-machine-learning
- sap
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: reagent-optimization
port: 8080
tools:
- name: optimize-reagents
description: "Analyze reagent usage, optimize dosing, update procurement, and report."
inputParameters:
- name: plant_id
in: body
type: string
description: "The processing plant identifier."
- name: processing_channel
in: body
type: string
description: "Microsoft Teams processing channel."
steps:
- name: get-usage
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM REAGENT_CONSUMPTION WHERE plant_id = '{{plant_id}}' AND consumption_date >= DATEADD(day, -7, CURRENT_DATE())"
warehouse: "OPERATIONS_WH"
- name: optimize-dosing
type: call
call: "azureml.score"
with:
model_type: "reagent_dosing"
data: "{{get-usage.results}}"
- name: update-procurement
type: call
call: "sap.update-material-requirement"
with:
plant_id: "{{plant_id}}"
reagent_id: "{{optimize-dosing.reagent_id}}"
optimized_rate: "{{optimize-dosing.optimal_rate_kg_h}}"
- name: report-savings
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{processing_channel}}"
text: "Reagent optimization for plant {{plant_id}}: Optimal rate {{optimize-dosing.optimal_rate_kg_h}} kg/h. Projected savings: ${{optimize-dosing.monthly_savings}}/month."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MATERIAL_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: materials
path: "/A_MaterialRequirement"
operations:
- name: update-material-requirement
method: PATCH
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Retrieves cached values from Redis for fast lookup of real-time mine site telemetry and configuration data.
naftiko: "0.5"
info:
label: "Redis Cache Lookup"
description: "Retrieves cached values from Redis for fast lookup of real-time mine site telemetry and configuration data."
tags:
- caching
- redis
capability:
exposes:
- type: mcp
namespace: redis-cache
port: 8080
tools:
- name: get-value
description: "Look up a value in Redis by key."
inputParameters:
- name: key
in: body
type: string
description: "The Redis key to retrieve."
call: "redis.get-key"
with:
key: "{{key}}"
outputParameters:
- name: value
type: string
mapping: "$.value"
- name: ttl
type: integer
mapping: "$.ttl"
consumes:
- type: http
namespace: redis
baseUri: "https://redis.bhp.com:6380"
authentication:
type: apiKey
name: "Authorization"
in: header
value: "$secrets.redis_token"
resources:
- name: keys
path: "/get/{{key}}"
inputParameters:
- name: key
in: path
operations:
- name: get-key
method: GET
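Redis itself speaks RESP, not HTTPS, so the baseUri above implies an HTTP gateway in front of the cluster. A sketch of the lookup under that assumption; the `/get/{key}` path comes from the capability spec, while the raw token in the Authorization header is an assumption about the gateway:

```python
REDIS_BASE = "https://redis.bhp.com:6380"

def build_redis_get(key: str, token: str) -> dict:
    """Build the redis.get-key lookup against the assumed HTTP gateway.

    The $.value and $.ttl output mappings would then be applied to the
    gateway's JSON response.
    """
    return {
        "method": "GET",
        "url": f"{REDIS_BASE}/get/{key}",
        "headers": {"Authorization": token},
    }
```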
Ingests new drill data from S3, updates the block model in Snowflake, recalculates resource estimates via Azure ML, and publishes to SharePoint.
naftiko: "0.5"
info:
label: "Resource Model Update Pipeline"
description: "Ingests new drill data from S3, updates the block model in Snowflake, recalculates resource estimates via Azure ML, and publishes to SharePoint."
tags:
- resource-estimation
- block-model
- amazon-s3
- snowflake
- azure-machine-learning
- sharepoint
capability:
exposes:
- type: mcp
namespace: resource-model
port: 8080
tools:
- name: update-resource-model
description: "Ingest drill data, update block model, recalculate resources, and publish."
inputParameters:
- name: deposit_id
in: body
type: string
description: "The deposit identifier."
- name: bucket
in: body
type: string
description: "S3 bucket with drill data."
- name: data_key
in: body
type: string
description: "S3 key for drill data file."
steps:
- name: ingest-data
type: call
call: "s3.get-object"
with:
bucket: "{{bucket}}"
key: "{{data_key}}"
- name: update-block-model
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL UPDATE_BLOCK_MODEL('{{deposit_id}}')"
warehouse: "GEOLOGY_WH"
- name: estimate-resources
type: call
call: "azureml.score"
with:
model_type: "resource_estimation"
deposit_id: "{{deposit_id}}"
- name: publish-estimate
type: call
call: "sharepoint.upload-file"
with:
site_id: "geology_site"
folder_path: "ResourceModels/{{deposit_id}}"
file_name: "resource_estimate_{{deposit_id}}.pdf"
consumes:
- type: http
namespace: s3
baseUri: "https://{{bucket}}.s3.amazonaws.com"
authentication:
type: aws-sigv4
accessKeyId: "$secrets.aws_access_key"
secretAccessKey: "$secrets.aws_secret_key"
resources:
- name: objects
path: "/{{key}}"
inputParameters:
- name: bucket
in: path
- name: key
in: path
operations:
- name: get-object
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
When a safety incident is reported, creates a ServiceNow critical incident, logs it in SAP EHS, notifies the safety team via Microsoft Teams, and uploads the incident report to SharePoint.
naftiko: "0.5"
info:
label: "Safety Incident Reporting Pipeline"
description: "When a safety incident is reported, creates a ServiceNow critical incident, logs it in SAP EHS, notifies the safety team via Microsoft Teams, and uploads the incident report to SharePoint."
tags:
- safety
- incident
- servicenow
- sap
- microsoft-teams
- sharepoint
capability:
exposes:
- type: mcp
namespace: safety-incidents
port: 8080
tools:
- name: report-safety-incident
description: "Given safety incident details, orchestrate reporting across ServiceNow, SAP EHS, SharePoint, and Microsoft Teams."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site location code."
- name: incident_type
in: body
type: string
description: "Type of safety incident (e.g., near-miss, injury, environmental)."
- name: description
in: body
type: string
description: "Description of the incident."
- name: reporter_email
in: body
type: string
description: "Email of the person reporting the incident."
steps:
- name: create-snow-incident
type: call
call: "servicenow.create-incident"
with:
short_description: "Safety incident at {{mine_site}}: {{incident_type}}"
priority: "1"
category: "safety"
description: "{{description}}. Reported by: {{reporter_email}}."
- name: log-in-sap
type: call
call: "sap.create-ehs-incident"
with:
plant: "{{mine_site}}"
incident_type: "{{incident_type}}"
description: "{{description}}"
- name: upload-report
type: call
call: "sharepoint.upload-file"
with:
site_id: "safety_records_site"
folder_path: "Incidents/{{mine_site}}"
file_name: "incident_{{create-snow-incident.number}}.txt"
- name: notify-safety-team
type: call
call: "msteams.post-channel-message"
with:
channel_id: "safety-operations-channel"
text: "SAFETY ALERT at {{mine_site}}: {{incident_type}}. ServiceNow: {{create-snow-incident.number}}. SAP EHS: {{log-in-sap.incident_id}}. Report: {{upload-report.url}}"
consumes:
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_EHS_INCIDENT_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: ehs-incidents
path: "/A_EHSIncident"
operations:
- name: create-ehs-incident
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
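The create-snow-incident step above is a standard ServiceNow Table API insert; the field names (`short_description`, `priority`, `category`, `description`) are standard incident table columns. A sketch of the request it builds:

```python
import base64
import json

SNOW_BASE = "https://bhp.service-now.com/api/now"

def build_create_incident(mine_site: str, incident_type: str,
                          description: str, user: str, password: str) -> dict:
    """Build the servicenow.create-incident POST from the first step.

    The response's incident number ($.number) feeds the later
    upload-report and notify-safety-team steps.
    """
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {
        "method": "POST",
        "url": f"{SNOW_BASE}/table/incident",
        "headers": {
            "Authorization": f"Basic {creds}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "short_description": f"Safety incident at {mine_site}: {incident_type}",
            "priority": "1",
            "category": "safety",
            "description": description,
        }),
    }
```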
Retrieves a Salesforce account by ID, returning name, industry, annual revenue, and account owner for mining customer management.
naftiko: "0.5"
info:
label: "Salesforce Account Lookup"
description: "Retrieves a Salesforce account by ID, returning name, industry, annual revenue, and account owner for mining customer management."
tags:
- sales
- salesforce
- account
capability:
exposes:
- type: mcp
namespace: sales-crm
port: 8080
tools:
- name: get-account
description: "Look up a Salesforce account by ID."
inputParameters:
- name: account_id
in: body
type: string
description: "The Salesforce account ID."
call: "salesforce.get-account"
with:
account_id: "{{account_id}}"
outputParameters:
- name: name
type: string
mapping: "$.Name"
- name: industry
type: string
mapping: "$.Industry"
- name: annual_revenue
type: string
mapping: "$.AnnualRevenue"
- name: owner
type: string
mapping: "$.Owner.Name"
consumes:
- type: http
namespace: salesforce
baseUri: "https://bhp.my.salesforce.com/services/data/v58.0"
authentication:
type: bearer
token: "$secrets.salesforce_token"
resources:
- name: accounts
path: "/sobjects/Account/{{account_id}}"
inputParameters:
- name: account_id
in: path
operations:
- name: get-account
method: GET
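The get-account operation above is a single sObject GET. One caveat worth flagging: a plain sObject retrieve returns `OwnerId`, not the expanded `Owner.Name` that the output mapping expects, so in practice a SOQL query selecting `Owner.Name` may be needed instead (an assumption worth validating against the org). A sketch of the request as specified:

```python
SF_BASE = "https://bhp.my.salesforce.com/services/data/v58.0"

def build_get_account(account_id: str, token: str) -> dict:
    """Build the salesforce.get-account GET per the capability spec."""
    return {
        "method": "GET",
        "url": f"{SF_BASE}/sobjects/Account/{account_id}",
        "headers": {"Authorization": f"Bearer {token}"},
    }
```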
Executes a report query against SAP BW for mining operations analysis including production volumes and cost metrics.
naftiko: "0.5"
info:
label: "SAP BW Mining Report Query"
description: "Executes a report query against SAP BW for mining operations analysis including production volumes and cost metrics."
tags:
- analytics
- sap-bw
- reporting
- mining
capability:
exposes:
- type: mcp
namespace: bw-reporting
port: 8080
tools:
- name: run-bw-query
description: "Execute a SAP BW query for mining analytics."
inputParameters:
- name: query_name
in: body
type: string
description: "The SAP BW query technical name."
- name: mine_site
in: body
type: string
description: "Mine site filter parameter."
- name: period
in: body
type: string
description: "Reporting period (e.g., 2026.01)."
call: "sap-bw.execute-query"
with:
query: "{{query_name}}"
plant: "{{mine_site}}"
period: "{{period}}"
consumes:
- type: http
namespace: sap-bw
baseUri: "https://bhp-bw.sap.com/sap/opu/odata/sap/API_BW_QUERY_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: queries
path: "/QueryResults?queryName={{query}}&plant={{plant}}&period={{period}}"
inputParameters:
- name: query
in: query
- name: plant
in: query
- name: period
in: query
operations:
- name: execute-query
method: GET
Retrieves equipment status from SAP Plant Maintenance for mine site heavy machinery and processing plant assets.
naftiko: "0.5"
info:
label: "SAP Equipment Status Lookup"
description: "Retrieves equipment status from SAP Plant Maintenance for mine site heavy machinery and processing plant assets."
tags:
- equipment
- sap
capability:
exposes:
- type: mcp
namespace: sap-equipment
port: 8080
tools:
- name: get-equipment-status
description: "Retrieve equipment status from SAP PM."
inputParameters:
- name: equipment_id
in: body
type: string
description: "The SAP equipment number."
call: "sap.get-equipment"
with:
equipment_id: "{{equipment_id}}"
outputParameters:
- name: status
type: string
mapping: "$.d.UserStatus"
- name: description
type: string
mapping: "$.d.Description"
- name: last_maintenance
type: string
mapping: "$.d.LastMaintenanceDate"
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_EQUIPMENT"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: equipment
path: "/Equipment('{{equipment_id}}')"
inputParameters:
- name: equipment_id
in: path
operations:
- name: get-equipment
method: GET
Retrieves SAP material master data by material number, returning description, unit of measure, and material group for mining supplies.
naftiko: "0.5"
info:
label: "SAP Material Master Lookup"
description: "Retrieves SAP material master data by material number, returning description, unit of measure, and material group for mining supplies."
tags:
- erp
- sap
- materials
- mining
capability:
exposes:
- type: mcp
namespace: erp-materials
port: 8080
tools:
- name: get-material
description: "Look up a SAP material master record by material number."
inputParameters:
- name: material_number
in: body
type: string
description: "The SAP material number."
call: "sap.get-material"
with:
material_number: "{{material_number}}"
outputParameters:
- name: description
type: string
mapping: "$.d.MaterialDescription"
- name: unit_of_measure
type: string
mapping: "$.d.BaseUnit"
- name: material_group
type: string
mapping: "$.d.MaterialGroup"
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_PRODUCT_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: materials
path: "/A_Product('{{material_number}}')"
inputParameters:
- name: material_number
in: path
operations:
- name: get-material
method: GET
Retrieves a SAP PM work order by order number, returning status, equipment, maintenance type, and planned dates for mine equipment.
naftiko: "0.5"
info:
label: "SAP Plant Maintenance Work Order"
description: "Retrieves a SAP PM work order by order number, returning status, equipment, maintenance type, and planned dates for mine equipment."
tags:
- maintenance
- sap
- equipment
- mining
capability:
exposes:
- type: mcp
namespace: plant-maintenance
port: 8080
tools:
- name: get-work-order
description: "Look up a SAP PM work order by order number."
inputParameters:
- name: order_number
in: body
type: string
description: "The SAP maintenance order number."
call: "sap.get-maintenance-order"
with:
order_number: "{{order_number}}"
outputParameters:
- name: status
type: string
mapping: "$.d.OrderStatus"
- name: equipment
type: string
mapping: "$.d.Equipment"
- name: maintenance_type
type: string
mapping: "$.d.MaintenanceType"
- name: planned_start
type: string
mapping: "$.d.PlannedStartDate"
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MAINTORDER"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: maintenance-orders
path: "/MaintenanceOrder('{{order_number}}')"
inputParameters:
- name: order_number
in: path
operations:
- name: get-maintenance-order
method: GET
Looks up a SAP S/4HANA purchase order by number and returns header status, vendor, total value, and currency for mining procurement.
naftiko: "0.5"
info:
label: "SAP Purchase Order Status"
description: "Looks up a SAP S/4HANA purchase order by number and returns header status, vendor, total value, and currency for mining procurement."
tags:
- procurement
- erp
- sap
- mining
capability:
exposes:
- type: mcp
namespace: erp-procurement
port: 8080
tools:
- name: get-purchase-order
description: "Look up a SAP S/4HANA purchase order by PO number."
inputParameters:
- name: po_number
in: body
type: string
description: "The SAP purchase order number (10-digit)."
call: "sap.get-po"
with:
po_number: "{{po_number}}"
outputParameters:
- name: status
type: string
mapping: "$.d.OverallStatus"
- name: vendor
type: string
mapping: "$.d.Supplier.CompanyName"
- name: total_value
type: string
mapping: "$.d.TotalAmount"
- name: currency
type: string
mapping: "$.d.TransactionCurrency"
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/MM_PUR_PO_MAINT_V2_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: purchase-orders
path: "/A_PurchaseOrder('{{po_number}}')"
inputParameters:
- name: po_number
in: path
operations:
- name: get-po
method: GET
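The get-po operation above follows the single-entity OData pattern (the key in quotes inside parentheses) that the equipment, material, and maintenance-order lookups in this collection share. The `$.d.*` output mappings imply the OData v2 JSON envelope, so requesting JSON explicitly avoids the XML default. A representative sketch:

```python
import base64

SAP_BASE = "https://bhp-s4.sap.com/sap/opu/odata/sap/MM_PUR_PO_MAINT_V2_SRV"

def build_get_po(po_number: str, user: str, password: str) -> dict:
    """Build the sap.get-po GET; the same shape serves the other
    SAP single-entity lookups with a different service and entity set."""
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    return {
        "method": "GET",
        "url": f"{SAP_BASE}/A_PurchaseOrder('{po_number}')",
        "headers": {
            "Authorization": f"Basic {creds}",
            "Accept": "application/json",
        },
    }
```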
Collects SCADA alarm floods from Datadog, prioritizes via Azure ML, creates work orders in SAP, and notifies the control room via Slack.
naftiko: "0.5"
info:
label: "SCADA Alarm Triage Pipeline"
description: "Collects SCADA alarm floods from Datadog, prioritizes via Azure ML, creates work orders in SAP, and notifies the control room via Slack."
tags:
- scada
- alarm-management
- datadog
- azure-machine-learning
- sap
- slack
capability:
exposes:
- type: mcp
namespace: scada-triage
port: 8080
tools:
- name: triage-alarms
description: "Triage SCADA alarms, prioritize, create work orders, and notify."
inputParameters:
- name: plant_id
in: body
type: string
description: "The processing plant identifier."
- name: slack_channel
in: body
type: string
description: "Slack channel for control room."
steps:
- name: get-alarms
type: call
call: "datadog.query-metrics"
with:
query: "sum:scada.alarms{plant:{{plant_id}}} by {tag}"
from: "-1h"
- name: prioritize
type: call
call: "azureml.score"
with:
model_type: "alarm_prioritization"
data: "{{get-alarms.series}}"
- name: create-work-orders
type: call
call: "sap.create-work-order"
with:
order_type: "CORRECTIVE"
plant_id: "{{plant_id}}"
description: "SCADA alarm: {{prioritize.top_alarm}}. Priority: {{prioritize.priority}}."
- name: notify-control-room
type: call
call: "slack.post-message"
with:
channel: "{{slack_channel}}"
text: "SCADA triage plant {{plant_id}}: {{prioritize.total_alarms}} alarms. Critical: {{prioritize.critical_count}}. Top: {{prioritize.top_alarm}}. WO: {{create-work-orders.order_id}}"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MAINTORDER"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: work-orders
path: "/MaintenanceOrder"
operations:
- name: create-work-order
method: POST
- type: http
namespace: slack
baseUri: "https://slack.com/api"
authentication:
type: bearer
token: "$secrets.slack_bot_token"
resources:
- name: messages
path: "/chat.postMessage"
operations:
- name: post-message
method: POST
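The four steps above chain by feeding each step's outputs into the templates of later steps. A minimal dry-run sketch of that chaining, with a stub dispatcher standing in for the real Datadog/Azure ML/SAP/Slack consumers (all canned values are invented):

```python
# Hypothetical sketch of triage-alarms' step chaining; the `call` argument
# stands in for the capability runtime's dispatcher.
def run_triage(plant_id, slack_channel, call):
    alarms = call("datadog.query-metrics",
                  query=f"sum:scada.alarms{{plant:{plant_id}}} by {{tag}}",
                  frm="-1h")  # 'frm' avoids the Python keyword 'from'
    ranked = call("azureml.score", model_type="alarm_prioritization",
                  data=alarms["series"])
    wo = call("sap.create-work-order", order_type="CORRECTIVE",
              plant_id=plant_id,
              description=f"SCADA alarm: {ranked['top_alarm']}. "
                          f"Priority: {ranked['priority']}.")
    call("slack.post-message", channel=slack_channel,
         text=f"SCADA triage plant {plant_id}: {ranked['total_alarms']} alarms. "
              f"WO: {wo['order_id']}")
    return wo

# Stub dispatcher with canned responses for a dry run.
canned = {
    "datadog.query-metrics": {"series": [[1, 2, 3]]},
    "azureml.score": {"top_alarm": "HIGH_LEVEL_TK101", "priority": "1",
                      "total_alarms": 42, "critical_count": 3},
    "sap.create-work-order": {"order_id": "WO-000123"},
    "slack.post-message": {"ok": True},
}
wo = run_triage("PLANT7", "#control-room", lambda op, **kw: canned[op])
```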
Retrieves a ServiceNow incident by number, returning state, priority, assigned group, and short description.
naftiko: "0.5"
info:
label: "ServiceNow Incident Lookup"
description: "Retrieves a ServiceNow incident by number, returning state, priority, assigned group, and short description."
tags:
- itsm
- servicenow
- incident
capability:
exposes:
- type: mcp
namespace: itsm-incidents
port: 8080
tools:
- name: get-incident
description: "Look up a ServiceNow incident by number."
inputParameters:
- name: incident_number
in: body
type: string
description: "The ServiceNow incident number (e.g., INC0012345)."
call: "servicenow.get-incident"
with:
incident_number: "{{incident_number}}"
outputParameters:
- name: state
type: string
mapping: "$.result.state"
- name: priority
type: string
mapping: "$.result.priority"
- name: assigned_group
type: string
mapping: "$.result.assignment_group.display_value"
- name: short_description
type: string
mapping: "$.result.short_description"
consumes:
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident?sysparm_query=number={{incident_number}}"
inputParameters:
- name: incident_number
in: query
operations:
- name: get-incident
method: GET
Retrieves metadata for a SharePoint document by path, returning file name, size, last modified date, and download URL.
naftiko: "0.5"
info:
label: "SharePoint Document Retrieval"
description: "Retrieves metadata for a SharePoint document by path, returning file name, size, last modified date, and download URL."
tags:
- collaboration
- sharepoint
- documents
capability:
exposes:
- type: mcp
namespace: doc-management
port: 8080
tools:
- name: get-document
description: "Look up a SharePoint document by site and path."
inputParameters:
- name: site_id
in: body
type: string
description: "The SharePoint site ID."
- name: file_path
in: body
type: string
description: "The path to the file."
call: "sharepoint.get-file"
with:
site_id: "{{site_id}}"
file_path: "{{file_path}}"
outputParameters:
- name: file_name
type: string
mapping: "$.name"
- name: size
type: integer
mapping: "$.size"
- name: last_modified
type: string
mapping: "$.lastModifiedDateTime"
- name: download_url
type: string
mapping: "$['@microsoft.graph.downloadUrl']"
consumes:
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{file_path}}"
inputParameters:
- name: site_id
in: path
- name: file_path
in: path
operations:
- name: get-file
method: GET
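One detail worth noting: Microsoft Graph keys the download link as `@microsoft.graph.downloadUrl`, a name containing `@` and embedded dots, so a plain dotted JSONPath cannot address it and bracket notation is needed. A sketch with an illustrative extractor (the sample driveItem payload is invented):

```python
# Hypothetical sketch: addressing Graph's "@microsoft.graph.downloadUrl"
# key requires JSONPath bracket notation; dotted steps break on the dots
# inside the key. The extract helper is illustrative.
import re

def extract(payload, path):
    """Support $.a.b dotted steps and $['key with specials'] brackets."""
    node = payload
    for dotted, bracketed in re.findall(r"\.([A-Za-z_]\w*)|\['([^']+)'\]", path):
        node = node[dotted or bracketed]
    return node

item = {"name": "core_log_DDH-042.pdf", "size": 1048576,
        "lastModifiedDateTime": "2024-05-01T03:12:00Z",
        "@microsoft.graph.downloadUrl": "https://example.invalid/dl/abc"}

url = extract(item, "$['@microsoft.graph.downloadUrl']")
name = extract(item, "$.name")
```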
Compiles shift production data from Snowflake, safety incidents from ServiceNow, equipment status from SAP, and distributes the handover report via Microsoft Outlook.
naftiko: "0.5"
info:
label: "Shift Handover Report Orchestrator"
description: "Compiles shift production data from Snowflake, safety incidents from ServiceNow, equipment status from SAP, and distributes the handover report via Microsoft Outlook."
tags:
- shift-management
- reporting
- snowflake
- servicenow
- sap
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: shift-handover
port: 8080
tools:
- name: generate-handover
description: "Compile shift data and distribute handover report."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: shift
in: body
type: string
description: "Shift identifier."
- name: incoming_shift_email
in: body
type: string
description: "Incoming shift supervisor email."
steps:
- name: get-production
type: call
call: "snowflake.execute-statement"
with:
statement: "CALL GET_SHIFT_PRODUCTION('{{mine_site}}', '{{shift}}')"
warehouse: "OPERATIONS_WH"
- name: get-incidents
type: call
call: "servicenow.get-records"
with:
table: "incident"
query: "mine_site={{mine_site}}^shift={{shift}}^sys_created_on>=javascript:gs.beginningOfToday()"
- name: get-equipment
type: call
call: "sap.get-equipment-status"
with:
mine_site: "{{mine_site}}"
- name: send-handover
type: call
call: "outlook.send-mail"
with:
to: "{{incoming_shift_email}}"
subject: "Shift Handover: {{mine_site}} - {{shift}}"
body: "Production: {{get-production.total_tonnes}}t. Target: {{get-production.target_pct}}%. Incidents: {{get-incidents.count}}. Equipment available: {{get-equipment.availability_pct}}%."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: get-records
method: GET
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_EQUIPMENT"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: equipment
path: "/Equipment"
operations:
- name: get-equipment-status
method: GET
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
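The `get-incidents` step's `query` value is a ServiceNow encoded query: individual conditions AND-joined with `^`. Building it from parts keeps the template readable; a minimal sketch (site and shift codes are illustrative):

```python
# Hypothetical sketch: composing the sysparm_query encoded query used by
# the get-incidents step. ServiceNow AND-joins conditions with '^'.
def encoded_query(*conditions: str) -> str:
    return "^".join(conditions)

q = encoded_query(
    "mine_site=ESC",   # illustrative site code
    "shift=NIGHT",
    "sys_created_on>=javascript:gs.beginningOfToday()",
)
```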
Posts a message to a Slack channel for mining operations team communications and automated notifications.
naftiko: "0.5"
info:
label: "Slack Channel Message"
description: "Posts a message to a Slack channel for mining operations team communications and automated notifications."
tags:
- messaging
- slack
capability:
exposes:
- type: mcp
namespace: slack
port: 8080
tools:
- name: post-message
description: "Post a message to a Slack channel."
inputParameters:
- name: channel
in: body
type: string
description: "The Slack channel ID."
- name: text
in: body
type: string
description: "Message text."
call: "slack.post-message"
with:
channel: "{{channel}}"
text: "{{text}}"
outputParameters:
- name: ts
type: string
mapping: "$.ts"
- name: ok
type: boolean
mapping: "$.ok"
consumes:
- type: http
namespace: slack
baseUri: "https://slack.com/api"
authentication:
type: bearer
token: "$secrets.slack_bot_token"
resources:
- name: messages
path: "/chat.postMessage"
operations:
- name: post-message
method: POST
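Concretely, the `slack.post-message` call resolves to a bearer-authenticated POST against `chat.postMessage`. A sketch of the request that would be sent (the channel ID and token are placeholders):

```python
# Hypothetical sketch of the HTTP request the post-message operation maps
# to. Token and channel values are placeholders.
import json

def build_post_message(channel: str, text: str, token: str) -> dict:
    return {
        "method": "POST",
        "url": "https://slack.com/api/chat.postMessage",
        "headers": {"Authorization": f"Bearer {token}",
                    "Content-Type": "application/json; charset=utf-8"},
        "body": json.dumps({"channel": channel, "text": text}),
    }

req = build_post_message("C0123456789", "Crusher 3 back online", "xoxb-placeholder")
```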
Collects pit slope radar data from Datadog, analyzes displacement trends via Azure ML, creates safety alerts in ServiceNow, and notifies geotechnical engineers via Microsoft Teams.
naftiko: "0.5"
info:
label: "Slope Stability Monitoring Pipeline"
description: "Collects pit slope radar data from Datadog, analyzes displacement trends via Azure ML, creates safety alerts in ServiceNow, and notifies geotechnical engineers via Microsoft Teams."
tags:
- geotechnical
- slope-stability
- datadog
- azure-machine-learning
- servicenow
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: slope-stability
port: 8080
tools:
- name: monitor-slope
description: "Monitor slope displacement, analyze trends, alert safety, and notify engineers."
inputParameters:
- name: pit_id
in: body
type: string
description: "The open pit identifier."
- name: geotech_channel
in: body
type: string
description: "Microsoft Teams geotechnical channel."
steps:
- name: get-radar-data
type: call
call: "datadog.query-metrics"
with:
query: "avg:slope.displacement{pit:{{pit_id}}} by {prism}"
from: "-24h"
- name: analyze-displacement
type: call
call: "azureml.score"
with:
model_type: "slope_displacement"
data: "{{get-radar-data.series}}"
- name: create-safety-alert
type: call
call: "servicenow.create-incident"
with:
short_description: "Slope displacement alert: pit {{pit_id}}"
description: "Max displacement: {{analyze-displacement.max_displacement_mm}}mm. Rate: {{analyze-displacement.displacement_rate}}mm/day. Risk: {{analyze-displacement.risk_level}}."
urgency: "1"
- name: notify-geotech
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{geotech_channel}}"
text: "SLOPE ALERT pit {{pit_id}}: Displacement {{analyze-displacement.max_displacement_mm}}mm at {{analyze-displacement.displacement_rate}}mm/day. Risk: {{analyze-displacement.risk_level}}. Incident: {{create-safety-alert.number}}"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Executes a SQL statement against the BHP Snowflake data warehouse for mining operations analytics.
naftiko: "0.5"
info:
label: "Snowflake Mining Analytics Query"
description: "Executes a SQL statement against the BHP Snowflake data warehouse for mining operations analytics."
tags:
- data
- analytics
- snowflake
- mining
capability:
exposes:
- type: mcp
namespace: data-analytics
port: 8080
tools:
- name: run-snowflake-query
description: "Execute a SQL query against the BHP Snowflake warehouse."
inputParameters:
- name: sql_statement
in: body
type: string
description: "The SQL statement to execute."
- name: warehouse
in: body
type: string
description: "The Snowflake warehouse name."
call: "snowflake.execute-statement"
with:
statement: "{{sql_statement}}"
warehouse: "{{warehouse}}"
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
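The `execute-statement` operation posts to the Snowflake SQL API's `/statements` endpoint; the request body carries the statement and warehouse straight from the tool's inputs. A minimal sketch (the SQL text is illustrative):

```python
# Hypothetical sketch of the execute-statement request body for
# POST /api/v2/statements on the Snowflake SQL API.
import json

def statement_payload(sql_statement: str, warehouse: str) -> str:
    return json.dumps({"statement": sql_statement, "warehouse": warehouse})

body = statement_payload(
    "SELECT mine_site, SUM(tonnes) FROM PRODUCTION GROUP BY mine_site",
    "OPERATIONS_WH",
)
```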
Retrieves network node performance data from SolarWinds for mine site network infrastructure monitoring.
naftiko: "0.5"
info:
label: "SolarWinds Network Performance Check"
description: "Retrieves network node performance data from SolarWinds for mine site network infrastructure monitoring."
tags:
- monitoring
- solarwinds
- networking
capability:
exposes:
- type: mcp
namespace: network-monitoring
port: 8080
tools:
- name: get-node-performance
description: "Look up SolarWinds node performance by node ID."
inputParameters:
- name: node_id
in: body
type: string
description: "The SolarWinds node ID."
call: "solarwinds.get-node"
with:
node_id: "{{node_id}}"
outputParameters:
- name: status
type: string
mapping: "$.Status"
- name: cpu_load
type: string
mapping: "$.CPULoad"
- name: memory_used
type: string
mapping: "$.PercentMemoryUsed"
- name: response_time
type: string
mapping: "$.ResponseTime"
consumes:
- type: http
namespace: solarwinds
baseUri: "https://bhp-solarwinds.bhp.com/SolarWinds/InformationService/v3/Json"
authentication:
type: basic
username: "$secrets.solarwinds_user"
password: "$secrets.solarwinds_password"
resources:
- name: nodes
path: "/Query?query=SELECT+Status,CPULoad,PercentMemoryUsed,ResponseTime+FROM+Orion.Nodes+WHERE+NodeID={{node_id}}"
inputParameters:
- name: node_id
in: query
operations:
- name: get-node
method: GET
Retrieves a Sparx EA model package for mine operations architecture documentation.
naftiko: "0.5"
info:
label: "Sparx Enterprise Architect Model Export"
description: "Retrieves a Sparx EA model package for mine operations architecture documentation."
tags:
- architecture
- sparx-enterprise-architect
- documentation
capability:
exposes:
- type: mcp
namespace: ea-models
port: 8080
tools:
- name: get-ea-package
description: "Look up a Sparx EA model package by ID."
inputParameters:
- name: package_id
in: body
type: string
description: "The EA package GUID."
call: "sparxea.get-package"
with:
package_id: "{{package_id}}"
consumes:
- type: http
namespace: sparxea
baseUri: "https://bhp-ea.bhp.com/api/v1"
authentication:
type: bearer
token: "$secrets.sparxea_token"
resources:
- name: packages
path: "/packages/{{package_id}}"
inputParameters:
- name: package_id
in: path
operations:
- name: get-package
method: GET
When a shipping order is created in SAP, updates the logistics tracker in Snowflake, creates a ServiceNow tracking ticket, and notifies the logistics team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Supply Chain Logistics Pipeline"
description: "When a shipping order is created in SAP, updates the logistics tracker in Snowflake, creates a ServiceNow tracking ticket, and notifies the logistics team via Microsoft Teams."
tags:
- supply-chain
- logistics
- sap
- snowflake
- servicenow
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: supply-chain
port: 8080
tools:
- name: process-shipping-order
description: "Given a SAP shipping order, update tracking systems and notify logistics."
inputParameters:
- name: shipping_order_id
in: body
type: string
description: "The SAP shipping order number."
- name: logistics_channel
in: body
type: string
description: "Microsoft Teams channel for logistics updates."
steps:
- name: get-shipping-order
type: call
call: "sap.get-delivery"
with:
delivery_number: "{{shipping_order_id}}"
- name: update-tracker
type: call
call: "snowflake.execute-statement"
with:
statement: "INSERT INTO LOGISTICS_TRACKING (delivery_id, destination, commodity, weight_tonnes, status) VALUES ('{{shipping_order_id}}', '{{get-shipping-order.destination}}', '{{get-shipping-order.material}}', '{{get-shipping-order.weight}}', 'SHIPPED')"
warehouse: "LOGISTICS_WH"
- name: create-tracking-ticket
type: call
call: "servicenow.create-incident"
with:
short_description: "Shipping order {{shipping_order_id}} dispatched"
category: "logistics"
description: "Commodity: {{get-shipping-order.material}}. Destination: {{get-shipping-order.destination}}. Weight: {{get-shipping-order.weight}} tonnes."
- name: notify-logistics
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{logistics_channel}}"
text: "Shipment dispatched: {{shipping_order_id}}. {{get-shipping-order.material}} to {{get-shipping-order.destination}} ({{get-shipping-order.weight}}t). Tracking: {{create-tracking-ticket.number}}."
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_OUTBOUND_DELIVERY_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: deliveries
path: "/A_OutbDeliveryHeader('{{delivery_number}}')"
inputParameters:
- name: delivery_number
in: path
operations:
- name: get-delivery
method: GET
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: incidents
path: "/table/incident"
operations:
- name: create-incident
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
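The `update-tracker` step interpolates step outputs directly into the INSERT statement. The Snowflake SQL API also accepts a `bindings` map alongside the statement, which sidesteps quoting and injection issues; a sketch of the equivalent bound form (values are illustrative):

```python
# Hypothetical hardening sketch: the same LOGISTICS_TRACKING insert using
# Snowflake SQL API positional bindings instead of string interpolation.
import json

def bound_insert(delivery_id, destination, commodity, weight, status) -> str:
    sql = ("INSERT INTO LOGISTICS_TRACKING "
           "(delivery_id, destination, commodity, weight_tonnes, status) "
           "VALUES (?, ?, ?, ?, ?)")
    values = [delivery_id, destination, commodity, weight, status]
    # Bindings are keyed "1".."5" to match the positional placeholders.
    bindings = {str(i + 1): {"type": "TEXT", "value": str(v)}
                for i, v in enumerate(values)}
    return json.dumps({"statement": sql, "warehouse": "LOGISTICS_WH",
                       "bindings": bindings})

body = bound_insert("80001234", "Port Hedland", "Iron Ore", "185000", "SHIPPED")
```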
Aggregates sustainability KPIs from SAP, refreshes the Power BI sustainability dashboard, uploads the report to SharePoint, and emails stakeholders via Microsoft Outlook.
naftiko: "0.5"
info:
label: "Sustainability Metrics Reporting Pipeline"
description: "Aggregates sustainability KPIs from SAP, refreshes the Power BI sustainability dashboard, uploads the report to SharePoint, and emails stakeholders via Microsoft Outlook."
tags:
- sustainability
- reporting
- sap
- power-bi
- sharepoint
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: sustainability-reporting
port: 8080
tools:
- name: generate-sustainability-report
description: "Given a reporting period, aggregate sustainability metrics and distribute to stakeholders."
inputParameters:
- name: period_start
in: body
type: string
description: "Reporting period start date (YYYY-MM-DD)."
- name: period_end
in: body
type: string
description: "Reporting period end date (YYYY-MM-DD)."
- name: bi_dataset_id
in: body
type: string
description: "Power BI dataset ID for sustainability dashboard."
- name: bi_group_id
in: body
type: string
description: "Power BI workspace ID."
- name: stakeholder_emails
in: body
type: string
description: "Comma-separated email addresses."
steps:
- name: get-kpis
type: call
call: "sap.get-sustainability-kpis"
with:
start_date: "{{period_start}}"
end_date: "{{period_end}}"
- name: refresh-dashboard
type: call
call: "powerbi.refresh-dataset"
with:
group_id: "{{bi_group_id}}"
dataset_id: "{{bi_dataset_id}}"
- name: upload-report
type: call
call: "sharepoint.upload-file"
with:
site_id: "sustainability_reports_site"
folder_path: "Reports/{{period_start}}_{{period_end}}"
file_name: "sustainability_report.pdf"
- name: email-stakeholders
type: call
call: "outlook.send-mail"
with:
to: "{{stakeholder_emails}}"
subject: "BHP Sustainability Report: {{period_start}} to {{period_end}}"
body: "Sustainability report available. CO2 emissions: {{get-kpis.co2_tonnes}}t. Water usage: {{get-kpis.water_megalitres}}ML. Report: {{upload-report.url}}"
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_SUSTAINABILITY_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: sustainability-kpis
path: "/A_SustainabilityKPI?$filter=Date ge '{{start_date}}' and Date le '{{end_date}}'"
inputParameters:
- name: start_date
in: query
- name: end_date
in: query
operations:
- name: get-sustainability-kpis
method: GET
- type: http
namespace: powerbi
baseUri: "https://api.powerbi.com/v1.0/myorg"
authentication:
type: bearer
token: "$secrets.powerbi_token"
resources:
- name: datasets
path: "/groups/{{group_id}}/datasets/{{dataset_id}}/refreshes"
inputParameters:
- name: group_id
in: path
- name: dataset_id
in: path
operations:
- name: refresh-dataset
method: POST
- type: http
namespace: sharepoint
baseUri: "https://graph.microsoft.com/v1.0/sites"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: files
path: "/{{site_id}}/drive/root:/{{folder_path}}/{{file_name}}:/content"
inputParameters:
- name: site_id
in: path
- name: folder_path
in: path
- name: file_name
in: path
operations:
- name: upload-file
method: PUT
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
Collects tailings dam sensor data from Datadog, runs structural stability predictions via Azure ML, updates SAP asset records, and alerts the dam safety team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Tailings Dam Monitoring Pipeline"
description: "Collects tailings dam sensor data from Datadog, runs structural stability predictions via Azure ML, updates SAP asset records, and alerts the dam safety team via Microsoft Teams."
tags:
- tailings-management
- dam-safety
- datadog
- azure-machine-learning
- sap
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: tailings-dam
port: 8080
tools:
- name: monitor-tailings-dam
description: "Monitor dam sensors, predict stability, update records, and alert safety team."
inputParameters:
- name: dam_id
in: body
type: string
description: "The tailings dam identifier."
- name: safety_channel
in: body
type: string
description: "Microsoft Teams dam safety channel."
steps:
- name: get-sensor-data
type: call
call: "datadog.query-metrics"
with:
query: "avg:dam.piezometer{dam_id:{{dam_id}}} by {sensor}"
from: "-6h"
- name: predict-stability
type: call
call: "azureml.score"
with:
model_type: "dam_stability"
data: "{{get-sensor-data.series}}"
- name: update-asset
type: call
call: "sap.update-equipment"
with:
equipment_id: "{{dam_id}}"
stability_score: "{{predict-stability.factor_of_safety}}"
risk_level: "{{predict-stability.risk_level}}"
- name: alert-safety
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{safety_channel}}"
text: "Tailings dam {{dam_id}}: Factor of safety {{predict-stability.factor_of_safety}}. Risk: {{predict-stability.risk_level}}. Pore pressure trend: {{predict-stability.pressure_trend}}."
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_EQUIPMENT"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: equipment
path: "/Equipment('{{equipment_id}}')"
inputParameters:
- name: equipment_id
in: path
operations:
- name: update-equipment
method: PATCH
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
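The pipeline relays whatever risk level the Azure ML model returns. A gate like the following could map the factor-of-safety output onto a risk band before alerting; the thresholds here are purely illustrative, not BHP or model policy:

```python
# Hypothetical sketch: banding a factor-of-safety score into a risk level.
# Thresholds are illustrative placeholders only.
def risk_level(factor_of_safety: float) -> str:
    if factor_of_safety < 1.0:
        return "CRITICAL"
    if factor_of_safety < 1.3:
        return "HIGH"
    if factor_of_safety < 1.5:
        return "ELEVATED"
    return "NORMAL"
```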
Executes a query against the Teradata geological data warehouse, returning drill core assay and survey results.
naftiko: "0.5"
info:
label: "Teradata Geological Query"
description: "Executes a query against the Teradata geological data warehouse, returning drill core assay and survey results."
tags:
- geology
- teradata
capability:
exposes:
- type: mcp
namespace: teradata-geology
port: 8080
tools:
- name: query-geology
description: "Execute a geological data query against Teradata."
inputParameters:
- name: query
in: body
type: string
description: "The SQL query to execute."
call: "teradata.execute-query"
with:
query: "{{query}}"
outputParameters:
- name: rows
type: array
mapping: "$.results"
- name: row_count
type: integer
mapping: "$.rowCount"
consumes:
- type: http
namespace: teradata
baseUri: "https://teradata.bhp.com/api/v1"
authentication:
type: bearer
token: "$secrets.teradata_token"
resources:
- name: queries
path: "/queries"
operations:
- name: execute-query
method: POST
Executes a SQL query against the BHP Teradata warehouse for geological and operational mining data analysis.
naftiko: "0.5"
info:
label: "Teradata Mining Data Query"
description: "Executes a SQL query against the BHP Teradata warehouse for geological and operational mining data analysis."
tags:
- data
- analytics
- teradata
- mining
capability:
exposes:
- type: mcp
namespace: dw-analytics
port: 8080
tools:
- name: run-teradata-query
description: "Execute a SQL query against the Teradata warehouse."
inputParameters:
- name: sql_statement
in: body
type: string
description: "The SQL statement to execute."
- name: database_name
in: body
type: string
description: "The target Teradata database."
call: "teradata.execute-query"
with:
query: "{{sql_statement}}"
database: "{{database_name}}"
consumes:
- type: http
namespace: teradata
baseUri: "https://bhp-td.teradata.com/api/v1"
authentication:
type: bearer
token: "$secrets.teradata_token"
resources:
- name: queries
path: "/queries"
operations:
- name: execute-query
method: POST
Monitors haul truck tire pressure from Datadog, predicts blowout risk via Azure ML, schedules tire changes in SAP, and alerts fleet management via Microsoft Teams.
naftiko: "0.5"
info:
label: "Tire Pressure Monitoring Pipeline"
description: "Monitors haul truck tire pressure from Datadog, predicts blowout risk via Azure ML, schedules tire changes in SAP, and alerts fleet management via Microsoft Teams."
tags:
- tire-management
- predictive-maintenance
- datadog
- azure-machine-learning
- sap
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: tire-monitoring
port: 8080
tools:
- name: monitor-tires
description: "Monitor tire pressure, predict blowout risk, schedule changes, and alert."
inputParameters:
- name: fleet_id
in: body
type: string
description: "The haul truck fleet identifier."
- name: fleet_channel
in: body
type: string
description: "Microsoft Teams fleet management channel."
steps:
- name: get-pressure
type: call
call: "datadog.query-metrics"
with:
query: "avg:truck.tire.pressure{fleet:{{fleet_id}}} by {truck,position}"
from: "-12h"
- name: predict-risk
type: call
call: "azureml.score"
with:
model_type: "tire_blowout"
data: "{{get-pressure.series}}"
- name: schedule-change
type: call
call: "sap.create-work-order"
with:
order_type: "PREVENTIVE"
fleet_id: "{{fleet_id}}"
description: "Tire replacement: {{predict-risk.at_risk_count}} tires at risk"
- name: alert-fleet
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{fleet_channel}}"
text: "Tire alert fleet {{fleet_id}}: {{predict-risk.at_risk_count}} tires at risk. Highest: truck {{predict-risk.highest_risk_truck}}, position {{predict-risk.position}}. WO: {{schedule-change.order_id}}"
consumes:
- type: http
namespace: datadog
baseUri: "https://api.datadoghq.com/api/v1"
authentication:
type: apiKey
name: "DD-API-KEY"
in: header
value: "$secrets.datadog_api_key"
resources:
- name: metrics
path: "/query"
operations:
- name: query-metrics
method: GET
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_MAINTORDER"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: work-orders
path: "/MaintenanceOrder"
operations:
- name: create-work-order
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
Retrieves the latest trade details from Tradeweb for BHP corporate bonds.
naftiko: "0.5"
info:
label: "Tradeweb Bond Trading Lookup"
description: "Retrieves the latest trade details from Tradeweb for BHP corporate bonds."
tags:
- finance
- tradeweb
- trading
capability:
exposes:
- type: mcp
namespace: bond-trading
port: 8080
tools:
- name: get-trade-details
description: "Look up the latest Tradeweb trade by instrument ID."
inputParameters:
- name: instrument_id
in: body
type: string
description: "The Tradeweb instrument identifier."
call: "tradeweb.get-trade"
with:
instrument_id: "{{instrument_id}}"
outputParameters:
- name: price
type: string
mapping: "$.trade.price"
- name: yield
type: string
mapping: "$.trade.yield"
- name: trade_date
type: string
mapping: "$.trade.tradeDate"
consumes:
- type: http
namespace: tradeweb
baseUri: "https://api.tradeweb.com/v1"
authentication:
type: bearer
token: "$secrets.tradeweb_token"
resources:
- name: trades
path: "/trades?instrumentId={{instrument_id}}&limit=1"
inputParameters:
- name: instrument_id
in: query
operations:
- name: get-trade
method: GET
Sends an SMS alert to mine site personnel using Twilio for critical operational notifications.
naftiko: "0.5"
info:
label: "Twilio SMS Alert"
description: "Sends an SMS alert to mine site personnel using Twilio for critical operational notifications."
tags:
- notifications
- twilio
capability:
exposes:
- type: mcp
namespace: twilio-sms
port: 8080
tools:
- name: send-sms
description: "Send an SMS alert to a phone number."
inputParameters:
- name: to
in: body
type: string
description: "Recipient phone number in E.164 format."
- name: message
in: body
type: string
description: "The SMS message body."
call: "twilio.send-message"
with:
to: "{{to}}"
from: "$secrets.twilio_from_number"
body: "{{message}}"
outputParameters:
- name: message_sid
type: string
mapping: "$.sid"
- name: status
type: string
mapping: "$.status"
consumes:
- type: http
namespace: twilio
baseUri: "https://api.twilio.com/2010-04-01/Accounts/$secrets.twilio_account_sid"
authentication:
type: basic
username: "$secrets.twilio_account_sid"
password: "$secrets.twilio_auth_token"
resources:
- name: messages
path: "/Messages.json"
operations:
- name: send-message
method: POST
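The `send-message` operation resolves to a form-encoded POST to `/Messages.json`, basic-authenticated with the account SID and auth token. A sketch of the assembled request (SID, token, and numbers are placeholders):

```python
# Hypothetical sketch of the Twilio Messages request. Credentials and
# phone numbers are placeholders.
import base64
from urllib.parse import urlencode

def build_sms_request(account_sid, auth_token, to, from_, body) -> dict:
    creds = base64.b64encode(f"{account_sid}:{auth_token}".encode()).decode()
    return {
        "method": "POST",
        "url": f"https://api.twilio.com/2010-04-01/Accounts/{account_sid}/Messages.json",
        "headers": {"Authorization": f"Basic {creds}",
                    "Content-Type": "application/x-www-form-urlencoded"},
        "body": urlencode({"To": to, "From": from_, "Body": body}),
    }

req = build_sms_request("ACxxxx", "placeholder-token",
                        "+61400000000", "+61488888888",
                        "Evacuation drill at 14:00 AWST")
```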
When a vendor invoice arrives in SAP, validates it against the PO, creates a ServiceNow approval request, and notifies the finance team via Microsoft Teams.
naftiko: "0.5"
info:
label: "Vendor Invoice Processing Pipeline"
description: "When a vendor invoice arrives in SAP, validates it against the PO, creates a ServiceNow approval request, and notifies the finance team via Microsoft Teams."
tags:
- finance
- invoice
- sap
- servicenow
- microsoft-teams
capability:
exposes:
- type: mcp
namespace: invoice-processing
port: 8080
tools:
- name: process-vendor-invoice
description: "Given a SAP invoice number, validate against PO and route for approval."
inputParameters:
- name: invoice_number
in: body
type: string
description: "The SAP invoice number."
- name: finance_channel
in: body
type: string
description: "The Microsoft Teams team ID for finance approvals; the notification posts to that team's General channel."
steps:
- name: get-invoice
type: call
call: "sap.get-invoice"
with:
invoice_number: "{{invoice_number}}"
- name: get-po
type: call
call: "sap.get-po"
with:
po_number: "{{get-invoice.po_reference}}"
- name: create-approval
type: call
call: "servicenow.create-request"
with:
short_description: "Invoice approval: {{invoice_number}} (PO: {{get-invoice.po_reference}})"
category: "finance_approval"
description: "Invoice {{invoice_number}} for {{get-invoice.amount}} {{get-invoice.currency}} from {{get-po.vendor}}. PO total: {{get-po.total_value}}."
- name: notify-finance
type: call
call: "msteams.post-channel-message"
with:
channel_id: "{{finance_channel}}"
text: "Invoice {{invoice_number}} requires approval. Amount: {{get-invoice.amount}} {{get-invoice.currency}}. Vendor: {{get-po.vendor}}. ServiceNow: {{create-approval.number}}."
consumes:
- type: http
namespace: sap
baseUri: "https://bhp-s4.sap.com/sap/opu/odata/sap/API_SUPPLIERINVOICE_PROCESS_SRV"
authentication:
type: basic
username: "$secrets.sap_user"
password: "$secrets.sap_password"
resources:
- name: invoices
path: "/A_SupplierInvoice('{{invoice_number}}')"
inputParameters:
- name: invoice_number
in: path
operations:
- name: get-invoice
method: GET
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: requests
path: "/table/sc_request"
operations:
- name: create-request
method: POST
- type: http
namespace: msteams
baseUri: "https://graph.microsoft.com/v1.0"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: channel-messages
path: "/teams/{{channel_id}}/channels/general/messages"
inputParameters:
- name: channel_id
in: path
operations:
- name: post-channel-message
method: POST
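The four steps of process-vendor-invoice chain each step's output into the next step's inputs via the `{{step.field}}` templates. This sketch models that data flow with an in-memory stub in place of the Naftiko dispatcher and the live SAP/ServiceNow/Teams APIs; the field names mirror the templates above, while the stub's sample values are invented for illustration:

```python
# Data-flow sketch of the invoice pipeline. fake_call's canned responses
# are hypothetical; a real runtime would issue the consumed HTTP calls.
def process_vendor_invoice(invoice_number: str, finance_channel: str, call) -> str:
    """Run the four pipeline steps; returns the Teams notification text."""
    invoice = call("sap.get-invoice", invoice_number=invoice_number)
    po = call("sap.get-po", po_number=invoice["po_reference"])
    approval = call(
        "servicenow.create-request",
        short_description=f"Invoice approval: {invoice_number} "
                          f"(PO: {invoice['po_reference']})",
        category="finance_approval",
        description=f"Invoice {invoice_number} for {invoice['amount']} "
                    f"{invoice['currency']} from {po['vendor']}. "
                    f"PO total: {po['total_value']}.",
    )
    text = (f"Invoice {invoice_number} requires approval. "
            f"Amount: {invoice['amount']} {invoice['currency']}. "
            f"Vendor: {po['vendor']}. ServiceNow: {approval['number']}.")
    call("msteams.post-channel-message", channel_id=finance_channel, text=text)
    return text

def fake_call(operation, **params):
    """Stub dispatcher with canned responses for each consumed operation."""
    responses = {
        "sap.get-invoice": {"po_reference": "4500012345", "amount": "18200.00",
                            "currency": "AUD"},
        "sap.get-po": {"vendor": "Komatsu Australia", "total_value": "95000.00"},
        "servicenow.create-request": {"number": "REQ0042317"},
        "msteams.post-channel-message": {},
    }
    return responses[operation]

notification = process_vendor_invoice("5105600123", "finance-approvals", fake_call)
```

Because get-po depends on `get-invoice.po_reference` and notify-finance depends on `create-approval.number`, the steps must execute strictly in the order listed.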
Aggregates water usage data from Snowflake, forecasts demand via Azure ML, updates water permits in ServiceNow, and distributes the water balance report via Microsoft Outlook.
naftiko: "0.5"
info:
label: "Water Balance Management Pipeline"
description: "Aggregates water usage data from Snowflake, forecasts demand via Azure ML, updates water permits in ServiceNow, and distributes the water balance report via Microsoft Outlook."
tags:
- water-management
- sustainability
- snowflake
- azure-machine-learning
- servicenow
- microsoft-outlook
capability:
exposes:
- type: mcp
namespace: water-balance
port: 8080
tools:
- name: manage-water-balance
description: "Aggregate water data, forecast demand, update permits, and distribute report."
inputParameters:
- name: mine_site
in: body
type: string
description: "The mine site code."
- name: env_email
in: body
type: string
description: "Environmental team distribution list."
steps:
- name: get-water-data
type: call
call: "snowflake.execute-statement"
with:
statement: "SELECT * FROM WATER_BALANCE WHERE mine_site = '{{mine_site}}' AND period >= DATEADD(month, -1, CURRENT_DATE())"
warehouse: "ENV_WH"
- name: forecast-demand
type: call
call: "azureml.score"
with:
model_type: "water_demand_forecast"
data: "{{get-water-data.results}}"
- name: update-permit
type: call
call: "servicenow.update-record"
with:
table: "u_water_permit"
mine_site: "{{mine_site}}"
current_usage_ml: "{{forecast-demand.current_usage_ml}}"
forecast_ml: "{{forecast-demand.next_month_forecast_ml}}"
- name: distribute-report
type: call
call: "outlook.send-mail"
with:
to: "{{env_email}}"
subject: "Water Balance Report: {{mine_site}}"
body: "Current usage: {{forecast-demand.current_usage_ml}} ML. Forecast: {{forecast-demand.next_month_forecast_ml}} ML. Permit utilization: {{forecast-demand.permit_utilization_pct}}%."
consumes:
- type: http
namespace: snowflake
baseUri: "https://bhp.snowflakecomputing.com/api/v2"
authentication:
type: bearer
token: "$secrets.snowflake_token"
resources:
- name: statements
path: "/statements"
operations:
- name: execute-statement
method: POST
- type: http
namespace: azureml
baseUri: "https://bhp-ml.australiaeast.inference.ml.azure.com"
authentication:
type: bearer
token: "$secrets.azureml_token"
resources:
- name: scoring
path: "/score"
operations:
- name: score
method: POST
- type: http
namespace: servicenow
baseUri: "https://bhp.service-now.com/api/now"
authentication:
type: basic
username: "$secrets.servicenow_user"
password: "$secrets.servicenow_password"
resources:
- name: records
path: "/table/{{table}}"
inputParameters:
- name: table
in: path
operations:
- name: update-record
method: PATCH
- type: http
namespace: outlook
baseUri: "https://graph.microsoft.com/v1.0/me"
authentication:
type: bearer
token: "$secrets.msgraph_token"
resources:
- name: mail
path: "/sendMail"
operations:
- name: send-mail
method: POST
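The get-water-data step uses Snowflake's SQL API, where the statement and warehouse travel in a JSON body. A sketch of the request the execute-statement operation would assemble, with a placeholder token in place of `$secrets.snowflake_token`:

```python
# Sketch of the Snowflake SQL API call behind snowflake.execute-statement.
# The token is a placeholder; endpoint and warehouse come from the
# definition above.
import json

def build_execute_statement(account_url: str, token: str,
                            statement: str, warehouse: str) -> dict:
    """Assemble the POST /api/v2/statements request."""
    return {
        "method": "POST",
        "url": f"{account_url}/api/v2/statements",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"statement": statement, "warehouse": warehouse}),
    }

sql = ("SELECT * FROM WATER_BALANCE WHERE mine_site = 'PILBARA-01' "
       "AND period >= DATEADD(month, -1, CURRENT_DATE())")
req = build_execute_statement("https://bhp.snowflakecomputing.com", "token-123",
                              sql, "ENV_WH")
```

Note that the capability interpolates `{{mine_site}}` directly into the SQL text, so callers should restrict that input to known site codes.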
Retrieves employee details from Workday by worker ID, returning name, department, title, and work email.
naftiko: "0.5"
info:
label: "Workday Employee Directory Lookup"
description: "Retrieves employee details from Workday by worker ID, returning name, department, title, and work email."
tags:
- hr
- workday
- employee
capability:
exposes:
- type: mcp
namespace: hr-directory
port: 8080
tools:
- name: get-employee
description: "Look up an employee in Workday by worker ID."
inputParameters:
- name: worker_id
in: body
type: string
description: "The Workday worker ID."
call: "workday.get-worker"
with:
worker_id: "{{worker_id}}"
outputParameters:
- name: full_name
type: string
mapping: "$.fullName"
- name: department
type: string
mapping: "$.department"
- name: title
type: string
mapping: "$.jobTitle"
- name: email
type: string
mapping: "$.workEmail"
consumes:
- type: http
namespace: workday
baseUri: "https://wd2-impl-services1.workday.com/ccx/api/v1"
authentication:
type: bearer
token: "$secrets.workday_token"
resources:
- name: workers
path: "/workers/{{worker_id}}"
inputParameters:
- name: worker_id
in: path
operations:
- name: get-worker
method: GET
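The outputParameters above project JSONPath expressions onto the Workday response. For the simple single-key `$.field` mappings used here, the projection reduces to a dictionary lookup; the sample worker record below is invented for illustration:

```python
# Resolves simple "$.field" JSONPath mappings against a response payload.
# The sample Workday record is hypothetical.
def apply_mappings(response: dict, mappings: dict) -> dict:
    """Map each output name to the response field its '$.field' path names."""
    return {out: response[path.removeprefix("$.")]
            for out, path in mappings.items()}

mappings = {
    "full_name": "$.fullName",
    "department": "$.department",
    "title": "$.jobTitle",
    "email": "$.workEmail",
}
sample = {"fullName": "Dana Reyes", "department": "Geoscience",
          "jobTitle": "Senior Geologist", "workEmail": "dana.reyes@bhp.com"}
result = apply_mappings(sample, mappings)
```

Deeper JSONPath expressions (filters, arrays) would need a real JSONPath evaluator; this sketch covers only the flat mappings this capability declares.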
Creates a Zoom meeting for mine site coordination calls.
naftiko: "0.5"
info:
label: "Zoom Meeting Scheduler"
description: "Creates a Zoom meeting for mine site coordination calls."
tags:
- collaboration
- zoom
- meetings
capability:
exposes:
- type: mcp
namespace: meeting-scheduler
port: 8080
tools:
- name: create-zoom-meeting
description: "Create a Zoom meeting with specified topic and duration."
inputParameters:
- name: topic
in: body
type: string
description: "The meeting topic."
- name: duration
in: body
type: integer
description: "Meeting duration in minutes."
- name: start_time
in: body
type: string
description: "Meeting start time in ISO 8601 format."
call: "zoom.create-meeting"
with:
topic: "{{topic}}"
duration: "{{duration}}"
start_time: "{{start_time}}"
outputParameters:
- name: join_url
type: string
mapping: "$.join_url"
- name: meeting_id
type: string
mapping: "$.id"
consumes:
- type: http
namespace: zoom
baseUri: "https://api.zoom.us/v2"
authentication:
type: bearer
token: "$secrets.zoom_token"
resources:
- name: meetings
path: "/users/me/meetings"
operations:
- name: create-meeting
method: POST
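The create-meeting operation posts a JSON body to Zoom's meetings endpoint. A sketch of that request, with a placeholder token for `$secrets.zoom_token`; the meeting `type` field (2 = scheduled) is an assumption not stated in the definition above:

```python
# Sketch of the POST behind zoom.create-meeting. Token is a placeholder;
# the type=2 (scheduled meeting) field is an assumption.
import json

def build_create_meeting(token: str, topic: str, duration: int,
                         start_time: str) -> dict:
    """Assemble the POST /users/me/meetings request."""
    return {
        "method": "POST",
        "url": "https://api.zoom.us/v2/users/me/meetings",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "topic": topic,
            "type": 2,  # scheduled meeting
            "duration": duration,
            "start_time": start_time,
        }),
    }

req = build_create_meeting("token-123", "Pit 3 coordination", 30,
                           "2025-07-01T08:00:00Z")
```

The `join_url` and `meeting_id` output parameters map onto the `join_url` and `id` fields of Zoom's JSON response.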