Naftiko Signals
Naftiko Signals pulls publicly available blog posts, press releases, job postings, and social media to gather the signals coming out of companies across a diverse range of industries, looking to understand the different types of investments companies are making. We evaluate these signals across 25+ areas to understand where each enterprise is in its digital journey.
Measuring the overall API investment, from being API-first to design-first to full lifecycle API management, to understand where they are in their API journey.
Services
Tools
Standards
Areas
Measuring the AI investment occurring, from ChatGPT usage to MCP to investment in agentic automation, evaluating a company's grasp of it.
Services
Tools
Standards
Areas
Measuring the integration investment, involving iPaaS and embedded iPaaS, but also legacy approaches with ETL, batch, and other common ways of integrating.
Services
Tools
Standards
Areas
Measuring the data investment, how strong the data teams are, and what they are focused on, from access, quality, and analytics to governance and compliance issues.
Services
Tools
Standards
Areas
Measuring the database investment, what database platforms are in use, and what database tooling is in use across teams to provide data access.
Services
Tools
Standards
Areas
Measuring the automation investment in all of its forms to understand how sophisticated automation is, and how widely it is being applied across operations.
Services
Tools
Standards
Areas
Measuring the platform investment, and where a company is in its platform journey, evaluating what common services, guardrails, and roles are in place.
Services
Tools
Areas
Measuring the event-driven investment, looking at the types of APIs in use and the technology that is steering them toward event-driven architecture.
Services
Tools
Standards
Areas
Measuring the governance that is occurring, how focused it is on APIs, and how aligned it is with wider security, compliance, and other aspects of governance.
Services
Tools
Standards
Areas
Measuring the security investment, and whether it is still more application-focused or has evolved to be more API-centered, as well as how it accounts for AI.
Services
Tools
Standards
Areas
Measuring the formal investment into understanding and managing the SaaS portfolio, which is separate from just quantifying the scope of the portfolio and the number of services in use.
Services
Areas
Measuring the maturity of cloud, SaaS, AI cost management — user, usage plans, and token-level cost tracking, inference spend forecasting, model cost-performance optimization, GPU utilization monitoring, and chargeback models for shared AI infrastructure.
Services
Tools
Standards
Areas
Measuring the container investment, beginning with Docker, then moving to the cloud, and where Kubernetes fits in their overall platform journey with containers.
Services
Tools
Standards
Areas
Measuring the state of observability, and how they are monitoring, testing, tracing, and reporting on their operations via dashboards and other approaches.
Services
Tools
Areas
Measuring the virtualization investment, including data, examples, and synthetic data, but also API mocking and other ways companies are virtualizing resources.
Services
Tools
Standards
Areas
Measuring the operational investment, how much they think about the big-picture strategy of their operations, and how they can improve.
Services
Tools
Standards
Areas
Measuring the business alignment investment, and whether they are working to bridge engineering with business and investing more into the productization of APIs.
Services
Standards
Areas
Measuring the open-source investment, how much open source they use but also potentially contribute to, and whether they are investing in inner source.
Services
Tools
Standards
Areas
Measuring the standardization investment, beginning with what standards they intentionally or unintentionally use, but also their strategic approach.
Services
Tools
Standards
Areas
Measuring the different patterns in use across the different types of APIs, but also the parts and pieces of integrations, to understand the diversity of patterns.
Services
Tools
Standards
Areas
Measuring the specifications in use, such as OpenAPI, AsyncAPI, and JSON Schema, but also newer formats like A2A, MCP, and other AI specs.
Services
Tools
Standards
Areas
Measuring the code investment, what libraries and frameworks are in use, as well as any software development kits that are provided or being applied for integrations.
Services
Tools
Standards
Areas
Measuring the Apache tooling investment, what projects are in use, and how they are leveraged as part of operations, including involvement in the community.
Services
Tools
Standards
Areas
Measuring the CNCF tooling investment, what projects are in use, and how they are being leveraged as part of operations, including involvement in the community.
Services
Tools
Standards
Areas
Measuring the cloud investment, beginning with which clouds they use, but then looking at their approach to managing the technical and business sides.
Services
Tools
Standards
Areas
The entire SaaS portfolio for companies, beginning with the number of services, but then also evaluating which are infrastructure, platform, or more business-focused.
Services
Tools
Standards
Areas
Which programming languages are used by teams, understanding the diversity of languages in use and their relationship to services and tooling.
Services
Tools
Areas
Measuring how many mergers and acquisitions are conducted, and how their operations have been shaped by years of this M&A approach to innovation.
Services
Areas
Measuring investment in training and fine-tuning data pipelines — how organizations curate, label, version, and govern the proprietary datasets used to customize models, including text, image, audio, and video corpora.
Services
Tools
Standards
Areas
Measuring whether enterprises are tracking which models (base, fine-tuned, adapted) are deployed where, including version lineage, performance baselines, and rollback capabilities.
Services
Tools
Standards
Areas
Measuring the investment in processing non-text data — document extraction (OCR, PDF parsing), image and video analysis, audio transcription, and the pipelines that normalize these inputs for model consumption.
Services
Tools
Areas
Measuring the degree to which organizations are building or procuring domain-specific models versus relying on general-purpose models, and the regulatory or compliance drivers behind that choice.
Services
Areas
Measuring the investment in AI-specific testing — eval frameworks, regression benchmarks, hallucination detection, RAG accuracy scoring, and agent task completion rates as part of CI/CD and production monitoring.
Services
Tools
Standards
Areas
Measuring how enterprises are instrumenting AI developer workflows — adoption metrics for coding assistants, productivity baselines, internal satisfaction surveys, and the feedback loops between developers and AI platform teams.
Services
Tools
Areas
Measuring whether organizations have connected AI system performance to business outcomes — time saved, error reduction, customer satisfaction, cost avoidance — or whether measurement remains purely technical.
Services
Tools
Standards
Areas
Measuring how enterprises are responding to AI regulation — EU AI Act classification, risk assessments, model documentation, and whether compliance is proactive or reactive.
Services
Tools
Standards
Areas
Measuring whether formal AI review processes exist — review boards, use case approval workflows, model risk tiering, and the speed at which new AI use cases move from proposal to production.
Services
Tools
Areas
Measuring investment in AI-specific privacy infrastructure — consent management for training data, right-to-deletion compliance across memory systems, data lineage tracking, and cross-border data flow management for model training and inference.
Services
Tools
Standards
Areas
Measuring the deliberateness of model provider and infrastructure choices — single vs. multi-provider strategies, contractual terms, switching costs, and the balance between proprietary APIs and self-hosted open-source alternatives.
Services
Tools
Areas
Measuring the strategic AI partnerships announced and in practice — cloud AI partnerships (Azure OpenAI, AWS Bedrock, GCP Vertex), model provider relationships (OpenAI, Anthropic, Cohere, Mistral), and how these partnerships shape or constrain architectural choices.
Services
Areas
Measuring how enterprises are staffing AI initiatives — new roles (ML platform engineer, AI product manager, prompt engineer), team structures (centralized AI teams vs. embedded), and the skills gaps that job postings reveal about organizational readiness.