Documentation Index
Fetch the complete documentation index at: https://docs.platform.statista.ai/llms.txt
Use this file to discover all available pages before exploring further.
07. May 2026
MCP Server
- Release of the Market Insights MCP tools for both `/v1/mcp` and `/v1/mcp/copilot`, supporting searching and fetching market insights and forecasts. For more details, visit the documentation.
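MCP tools are invoked over JSON-RPC 2.0 via the standard `tools/call` method. A minimal sketch of such a request body; the tool name `search_market_insights` and its arguments are assumptions for illustration, not the documented schema:

```python
import json

# Build a standard MCP "tools/call" JSON-RPC 2.0 request body.
# The tool name and argument keys below are hypothetical; check the
# documentation for the actual Market Insights tool schema.
def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(payload)

request_body = build_tool_call("search_market_insights", {"query": "EV market forecast"})
```

An MCP client library would normally construct this envelope for you; the sketch only shows what travels over the wire to either endpoint.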
24. April 2026
MCP Server
- The Consumer Insights fetch tool description/prompt was too long for some models. We wrote a new, more compact prompt that includes guidelines to prevent incorrect usage.
- The Consumer Insights fetch tool input schema now includes additional information to guide LLMs toward better inputs.
22. April 2026
MCP Server
- The Consumer Insights fetch tool now supports calculating crosstabs across different splits. If no participant answered both questions, the data is filled in based on demographics.
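A crosstab counts how many participants fall into each combination of two splits. A toy sketch, assuming responses are available as per-participant answer pairs; the real Consumer Insights data model and the demographic fallback are more involved:

```python
from collections import Counter

# Toy cross-tabulation of two survey questions: count participants
# per (answer_to_q1, answer_to_q2) combination. This illustrates the
# crosstab itself, not the demographic imputation the tool applies
# when no participant answered both questions.
def crosstab(responses):
    """responses: list of (answer_to_q1, answer_to_q2) tuples."""
    return Counter(responses)

cells = crosstab([
    ("yes", "18-29"),
    ("no", "18-29"),
    ("yes", "30-49"),
    ("yes", "18-29"),
])
```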
15. April 2026
MCP Server
- Prompts/descriptions updated for the Statistics and Consumer Insights tools: unnecessary whitespace and line breaks removed.
- Tool parameter descriptions moved from the tool description to the input schema to make them more compact.
07. April 2026
MCP Server
- Support for searching in Consumer Insights surveys and fetching detailed cross-tabulated data for survey question and answer combinations.
- Consumer Insights MCP tools are available on `/v1/mcp` and the Copilot-specific endpoint `/v1/mcp/copilot`.
October 2025
REST API
- Chart images are now displayed in the source language.
- The `total_count` field now shows the correct number in `/search/statistics`.
- Similarity score for search results appended to `/search/statistics` under `ranking_score`.
- Updated API timeout from 5s to 30s to allow longer-running operations and avoid breakage in client systems.
- Multilingual search queries for `/search/statistics`.
- Extended authorization to include both `x-api-key` as custom header and `Authorization: Bearer`.
- Support DeepResearch response format and tool names under `/beta/deepresearch`.
- Coerce ID parameter to avoid runtime errors when LLMs pass in a numerical string.
- Automate MCP Client compatibility tests across LLM providers: OpenAI, Anthropic/Claude, Gemini.
- Condense output of the `search-statistics` tool to reduce LLM token consumption by 35%.
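Both authorization styles are plain HTTP headers, so either can be attached to any request. A minimal standard-library sketch; the base URL below is a placeholder, not the documented host:

```python
import urllib.request

BASE_URL = "https://api.example.com"  # placeholder host, for illustration only
API_KEY = "YOUR_API_KEY"

# Option 1: API key as a custom header.
req_custom = urllib.request.Request(
    f"{BASE_URL}/search/statistics?q=smartphones",
    headers={"x-api-key": API_KEY},
)

# Option 2: the same key as a standard bearer token.
req_bearer = urllib.request.Request(
    f"{BASE_URL}/search/statistics?q=smartphones",
    headers={"Authorization": f"Bearer {API_KEY}"},
)
```

Only one of the two headers is needed per request; the bearer form is convenient for clients that already speak standard OAuth-style auth.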
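The ID coercion can be pictured as a small validation helper that accepts either an integer or a numeric string; this is an illustrative sketch of the idea, not the server's actual implementation:

```python
def coerce_id(value):
    """Accept an int or a numeric string and return an int ID.

    LLM clients sometimes pass IDs as strings such as "12345";
    coercing up front avoids runtime type errors downstream.
    """
    if isinstance(value, int):
        return value
    if isinstance(value, str) and value.strip().isdigit():
        return int(value.strip())
    raise ValueError(f"invalid id: {value!r}")
```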