As market structure evolves from centralised, venue-based models to more fragmented, network-driven ecosystems, data is increasingly seen as a primary determinant of effective market access, execution quality and competitive differentiation. A recent Markets Unstructured report (March 2025) highlights that, alongside connectivity and analytics, control over data is fundamental to liquidity access and price formation in modern markets.
In this environment, the question is how effectively data is deployed and leveraged across trading lifecycles and associated operational workflows. This marks a demonstrable shift from straightforward data ‘consumption’ to proactive data structuring, integration and interrogation.
For firms operating in today’s multi-asset, data-driven markets, and subject to growing “information asymmetry”, this distinction is critical: those able to transform data into decision-ready intelligence are better positioned to navigate fragmented market liquidity, manage risk and deliver consistent outcomes.
As Andrew Jappy, EGM APAC at Iress TMD, notes: “At Iress TMD we don’t simply redistribute raw market data… we’re enabling an open environment where clients can access their own trading and market data securely, and interrogate - and use - it however they choose. As such, value extends beyond the data itself to the ease with which firms can access, combine and apply it across their own cross-enterprise workflows.”
Historically, market data infrastructure focused on delivering raw information - prices, quotes and order book updates - streamed directly from exchanges. But raw data does not, on its own, support efficient trading and confident investment decisions.
Modern trading and advisory environments rely increasingly on integrated datasets that combine price data with other sources, such as reference data, corporate actions, historical records and alternative data. Collectively, these datasets enable decision-ready intelligence that can be analysed, contextualised and acted upon in real time.
This shift also underscores a broader transformation across financial markets. Data is no longer something simply displayed on a screen; it is embedded directly within trading workflows, analytics and client interactions.
Firms today face a complex set of pressures with respect to data sourcing and resource allocation.
On one hand, the scope and importance of data is expanding continuously. Supporting modern trading and advisory workflows demands broader coverage across asset classes, deeper historical datasets and new data sources, including alternative data and derived analytics.
On the other, execution venue fragmentation and inconsistent data standards make it difficult to build - and maintain - a comprehensive and reliable view of the market. As highlighted in the Markets Unstructured report, market participants may have to source, cleanse and reconcile data from multiple providers to construct a usable dataset - creating duplication, workflow inefficiency and uneven outcomes for firms across the industry.
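As an illustration of that reconciliation burden, the sketch below compares the same instruments' prices from two feeds, flagging matches, mismatches and gaps. The provider feeds, symbols, prices and tolerance are all hypothetical; real pipelines would add timestamps, field-level rules and audit trails.

```python
# Hypothetical sketch: reconciling per-symbol prices sourced from two
# providers, flagging differences beyond a tolerance. Feed contents and
# the tolerance value are illustrative assumptions.

def reconcile(provider_a: dict, provider_b: dict, tolerance: float = 0.01) -> dict:
    """Compare per-symbol prices from two feeds and classify each symbol."""
    result = {"matched": [], "mismatched": [], "missing": []}
    for symbol in sorted(set(provider_a) | set(provider_b)):
        if symbol not in provider_a or symbol not in provider_b:
            result["missing"].append(symbol)       # present in only one feed
        elif abs(provider_a[symbol] - provider_b[symbol]) <= tolerance:
            result["matched"].append(symbol)       # prices agree
        else:
            result["mismatched"].append(symbol)    # prices disagree
    return result

feed_a = {"BHP.AX": 45.10, "CBA.AX": 112.50, "RIO.AX": 120.00}
feed_b = {"BHP.AX": 45.10, "CBA.AX": 112.95, "WBC.AX": 27.40}

report = reconcile(feed_a, feed_b)
print(report)
```

Even this toy version shows why duplicated sourcing is costly: every extra provider multiplies the comparison, exception-handling and remediation work.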
Many financial firms commit significant resources to data, yet struggle to translate that investment into consistent, decision-ready insights. Fragmented internal ‘data ownership’, duplicated permissions and entitlements, and disconnected platforms and systems mean that organisations often lack a single, trusted view of their data.
This dynamic is contributing to increasing information asymmetry across markets. Firms with the capability to normalise and analyse large datasets efficiently are better able to generate a comprehensive view of liquidity and pricing.
As Andrew Jappy observes: “More and more, we’re seeing that clients want to work with their data in their own environments - combining it with other sources, applying their own analytics and extracting insights specific to their workflows.”
This represents a clear shift from fragmented, desk-level consumption toward enterprise-wide data capability, where information is accessed and utilised actively (and proactively) across trading, compliance and client engagement workflows.
As trading environments become more sophisticated, the value of market data lies not only in access, but in quality, timeliness and structure. Inaccurate and inconsistent data can quickly lead to operational friction - from valuation errors to reconciliation breaks and compliance risk. At the same time, delays in data delivery undermine execution decisions and client communication.
Perhaps most critically, data must be structured consistently. Without normalisation across markets and venues, firms cannot effectively compare execution outcomes, generate analytics or integrate data into trading workflows.
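A minimal sketch of what normalisation means in practice: venue-specific field names are mapped onto a common schema so records from different venues become directly comparable. The venue names and field layouts below are illustrative assumptions, not any particular provider's format.

```python
# Hypothetical sketch: normalising venue-specific trade records into a
# common schema so execution outcomes can be compared across venues.
# Venue names and raw field names are illustrative assumptions.

COMMON_FIELDS = ("symbol", "price", "quantity", "venue")

# Per-venue mapping from the provider's field names to the common schema.
FIELD_MAPS = {
    "VENUE_X": {"sym": "symbol", "px": "price", "qty": "quantity"},
    "VENUE_Y": {"ticker": "symbol", "last": "price", "size": "quantity"},
}

def normalise(record: dict, venue: str) -> dict:
    """Rename venue-specific fields to the common schema and tag the venue."""
    mapping = FIELD_MAPS[venue]
    out = {common: record[raw] for raw, common in mapping.items()}
    out["venue"] = venue
    return out

raw_x = {"sym": "CBA.AX", "px": 112.50, "qty": 200}
raw_y = {"ticker": "CBA.AX", "last": 112.52, "size": 150}

ticks = [normalise(raw_x, "VENUE_X"), normalise(raw_y, "VENUE_Y")]
# Both records now share the same keys, so they can be compared directly.
assert all(set(t) == set(COMMON_FIELDS) for t in ticks)
```

Once every venue's output lands in the same shape, comparing execution outcomes or feeding analytics becomes a straightforward query rather than a bespoke translation exercise.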
High-quality data is therefore not simply a technical requirement - it is a critical foundation for decision-making, risk management and client service.
Another important shift is the way data insights are integrated into the trading workflow itself. Traditionally, analytics were consulted after the fact, through reports and in dashboards separated from the trading environment.
Increasingly, however, insights are embedded directly within the execution and advisory workflows themselves.
This approach ensures that insights are delivered at the moment decisions are made, rather than after trades have occurred. For advisers and brokers supporting retail and wholesale investors, this integration strengthens both decision quality and accountability.
Another driver behind evolving market data strategies is regulatory scrutiny. Across jurisdictions, regulators expect firms to maintain detailed records of trading activity and to be able to retrieve that information quickly if required.
Historically, the available window of accessible data in trading ecosystems has been limited, and accessing older data has been slow and resource-heavy. Andrew Jappy observes:
“Depending on the size and type of user firm, data might only be available for a month or at most perhaps twelve months. If the regulator asks for data outside of this limited window, and indeed from several years ago, our customers would have to request that we locate and restore it, which is time-consuming and expensive.”
Cloud-based data environments (data lakes and warehouses) help address this challenge by enabling longer-term storage and faster retrieval of trading records, improving both compliance readiness and operational efficiency.
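The retrieval gain comes largely from partitioning: records stored by date mean a multi-year regulatory request scans only the partitions in range, rather than restoring archives. The sketch below uses an in-memory dict as a stand-in for a data lake's date partitions; the trade records are hypothetical.

```python
# Hypothetical sketch: storing trade records partitioned by date so a
# multi-year query touches only the relevant partitions. An in-memory
# dict stands in for a cloud data lake's date-partitioned storage.

from collections import defaultdict
from datetime import date

partitions = defaultdict(list)  # partition key: trade date

def store(trade: dict) -> None:
    """Append a trade record to its date partition."""
    partitions[trade["date"]].append(trade)

def query(start: date, end: date) -> list:
    """Retrieve trades in [start, end] by scanning only matching partitions."""
    return [t for d in sorted(partitions) if start <= d <= end
            for t in partitions[d]]

store({"date": date(2021, 3, 1), "symbol": "BHP.AX", "qty": 100})
store({"date": date(2023, 6, 15), "symbol": "CBA.AX", "qty": 50})
store({"date": date(2024, 1, 9), "symbol": "BHP.AX", "qty": 75})

# A multi-year request becomes a bounded partition scan, not a restore job.
hits = query(date(2022, 1, 1), date(2024, 12, 31))
print(len(hits))
```

In a real cloud environment the same idea appears as date-partitioned object storage or warehouse tables, with the query engine pruning partitions outside the requested window.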
These developments reflect a broader shift toward data-first infrastructure strategies across the trading industry. Rather than treating market data as a secondary input, firms are designing environments where data accessibility, integration and analytics are built into the core architecture.
A data-first strategy embeds those capabilities - accessibility, integration and analytics - throughout the architecture rather than bolting them on afterwards.
According to Andrew Jappy, this approach ultimately empowers clients to determine how their data delivers value: “Our intention is to create a more open environment where clients can securely access their own data and decide how they want to use it - whether through our dashboards, their own analytics tools or third-party providers.”
Handled strategically, market data becomes a highly valuable asset - a whole that is significantly greater than the sum of its individual parts. It lays the foundation for better decision-making, stronger compliance, faster time to market and more resilient participation in increasingly complex financial markets.
The firms that succeed will not necessarily be those that invest the most in data, but those able to extract the greatest value from it.