This is the 8th edition of our industry newsletter with musings, observations and ideas regarding the challenges and opportunities facing market data management leaders.
When the exchange comes knocking: Your audit readiness checklist
By Eimear Keane, Senior Manager of Market Data Compliance, TRG Screen
👉 BONUS job aid: We’ve created a Market Data Audit Readiness Checklist to help you structure the process from start to finish.
Exchange audits are a fact of life for every market data team, but the process has changed a lot in recent years.
Before COVID, most were handled in person. Auditors would spend a few days onsite, walk through everything side by side with the team, and usually reach conclusions pretty quickly. Now, almost the entire process happens remotely – over email, shared files and calls.
That shift has its pros and cons. It’s easier to coordinate across regions, but without those in-person sessions, things can easily lose pace. Questions bounce around on email, clarifications take longer and what was once a focused exercise can start to drift. More importantly, the scope – and with it your risk – can start to creep.
For teams already juggling renewals, declarations and reporting cycles, an audit can quickly feel like one task too many. The secret to keeping it manageable? Preparation! Knowing what’s likely to be asked for, having the evidence ready and being clear on how you’ll respond.
The goal of an exchange audit
At its heart, an exchange audit is straightforward. It’s simply checking that how you use the data matches what you’re licensed to do and that the correct number of users has been reported. Auditors will look at who’s using it, where it sits, how access is controlled and whether any redistribution, derived data or non-display use needs extra licensing.
With audits now covering more systems, vendors and time periods than ever, good documentation really is your best friend. Having everything in order from the start keeps the process moving and can boost your credibility (and your confidence!) when the exchange comes knocking.
How to get ahead
The good news is that most of the delays and frustrations can be avoided with a clear process. Here’s how to make sure you’re ready before the next audit letter lands in your inbox.
- Lock down confidentiality early. It might seem obvious, but it’s amazing how often NDAs trip people up. Before any audit begins, check that your confidentiality agreements actually cover everyone who needs to be involved – including third parties and regional entities – and that there’s a secure way to share files. A quick housekeeping exercise up front can save hours of chasing down signatures later.
- Get your application inventory in shape. Auditors will always ask for a clear picture of where and how exchange data is used. Could you produce that list today – the applications, the data types, and who’s responsible for them? Having this ready doesn’t just speed up the process; it sets the tone that you’re in control and your house is in order.
- Check license alignment. Use this as an opportunity to revisit your use cases. Are you still operating under the same terms you were when the license was first agreed? Have new use cases crept in – non-display, derived, redistribution – that might change the calculation? Exchange policies evolve quickly, and those subtle shifts are often where compliance gaps appear.
- Validate your entitlement data. This one can make or break an audit. Before you send anything across, double-check that your entitlement records tell a complete story – that users, devices and access align with what’s been declared. It’s the kind of detail that feels tedious until you’re trying to explain inconsistencies a few weeks into an audit. (A minimal reconciliation sketch follows this list.)
- Review, negotiate and close the loop. When the draft liability report arrives, take the time to go through it carefully. Challenge the anomalies, clarify the assumptions and make sure remediation steps are written down and agreed by everyone. A good close-out process is what turns an audit from a headache into a valuable learning exercise, and sets you up better for the next audit.
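On the entitlement point above, a simple reconciliation can surface gaps before the auditor does. Here’s a minimal sketch in Python – the file names, column names and the idea of two tidy CSV extracts are illustrative assumptions, and a real entitlement system will need its own export:

```python
import csv

# Hypothetical extracts: what the entitlement system shows vs. what was declared to the exchange
def load_user_ids(path, column):
    """Read a CSV extract and return the set of user IDs in the given column."""
    with open(path, newline="") as f:
        return {row[column].strip().lower() for row in csv.DictReader(f)}

entitled = load_user_ids("entitlement_extract.csv", "user_id")    # assumed export from your entitlement system
declared = load_user_ids("exchange_declaration.csv", "user_id")   # assumed copy of your last declaration

undeclared = entitled - declared      # users with access who were never declared - the gap auditors care about most
over_declared = declared - entitled   # users declared (and possibly billed) who no longer have access

print(f"Entitled but not declared: {len(undeclared)}")
print(f"Declared but no longer entitled: {len(over_declared)}")
```

Even a rough comparison like this tells you where to dig before anything goes across to the exchange.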
Final thought
Let’s be honest, no one loves an audit. And they can draw a lot of attention from management. But they don’t have to be painful. A little preparation goes a long way. With the right information close to hand, you can turn what feels like an interruption into a quick, structured exercise that strengthens control, keeps your costs in check – and leaves you looking like a highly organized recordkeeping superhero to your leadership team.
AI in market data: How to make it work for you
With Amjad Zoghbi, Head of AI, and Ian Pilbeam, Market Data Consultant, TRG Screen
Everyone’s talking about AI, everywhere you turn.
With new assistants and automation tools continuing to appear across the industry, the challenge is no longer understanding what AI can do; it’s knowing where it genuinely adds value.
For market data, operations and procurement teams, the potential is huge. But so are the misconceptions.
Whether the push comes from your own firm or your vendors, you have more influence than you think. The teams that win won’t just adopt AI; they’ll help shape it.
We caught up with Amjad Zoghbi and Ian Pilbeam to distill the lessons from TRG Screen’s own AI deployments – both in internal job aids and in our products – and what every market data professional should keep in mind as AI becomes part of daily business.
Start with the problem and a clear measure
“Behind the noise and the glitter, AI is still just a tool,” says Amjad. “Before you think about what it can do, think about the problem you’re trying to solve and, even more importantly, the value to your business of solving it.”
That means having a clear problem statement and success metric from the start. If the goal is to save five minutes per license validation, for example, then make that explicit. When those minutes add up across a team, you’re talking measurable productivity.
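To put rough numbers behind that, here’s a back-of-the-envelope sketch – the team size, validation volumes and working weeks below are illustrative assumptions, not figures from any real deployment:

```python
# Back-of-the-envelope productivity estimate; every number below is an illustrative assumption
minutes_saved_per_validation = 5
validations_per_person_per_week = 20
team_size = 8
working_weeks_per_year = 46

hours_saved_per_year = (
    minutes_saved_per_validation
    * validations_per_person_per_week
    * team_size
    * working_weeks_per_year
) / 60

print(f"Estimated hours saved per year: {hours_saved_per_year:.0f}")   # about 613 hours
```

Roughly 600 hours a year under those assumptions – exactly the kind of explicit success metric worth agreeing before any build starts.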
Ian adds: “A lot of people still carry scar tissue from the ‘big data’ era. They saw grand projects that promised the world and didn’t deliver. The message this time is to focus. Use AI like a scalpel, not a sledgehammer – targeted and strategic.”
People and expertise shape AI
AI works best when the people who use it help shape it.
“When people can see their fingerprints on a tool, they trust it,” says Amjad. “It also helps to position AI as an enabler – it’s there to help people think faster and work smarter, not to replace them.” We’ve done that by involving employees in building and refining our internal AI tools, and by bringing users into the design and build of the AI we recently launched in two of our technologies. It makes all the difference.
That’s how real expertise gets built into the technology. “The most effective AI isn’t defined by its algorithms but by the human knowledge behind them – the business rules, judgement and context that guide how it behaves,” adds Amjad.
Inside your firm, draw on the people who know your workflows best. (They don’t have to be AI experts at all. They just need to understand the day in the life of potential users.) Externally, ask vendors how they’ve embedded domain knowledge into their models and how often it’s refreshed. Tools grounded in real market data experience will always outperform those built in isolation.
“Anyone can build an AI that extracts fields from a document,” Amjad adds. “It’s the blend of technology and experience that creates real value and builds a competitive moat.”
Ian agrees: “The firms that benefit most are the ones that stay close, shaping how AI is applied, not just accepting what’s delivered.”
Data quality still rules
Forget big data; think better data.
“Garbage in, garbage out hasn’t changed,” says Ian. “If your data is messy or incomplete, don’t expect an algorithm to make sense of it.”
Amjad agrees: “Before you think about AI, look at the data you already have. Without that foundation, it’s hard to get meaningful results. That’s something we’ve spoken about many times in this newsletter, and a drum we continue to beat. Clean data on the one hand, and clean API access to that data on the other, are both key for AI to be successful.”
For market data teams, that means data that’s structured, reliable and relevant to the task at hand, and continuously refined by people who understand what ‘good’ looks like. Be mindful of these basics and expect the same discipline from your vendors.
“Don’t just ask what the AI can do,” Amjad advises. “Ask what data it’s built on, how it learns and how transparent the process is.”
Clean, well-governed data isn’t just good hygiene; it’s the single biggest predictor of meaningful AI outcomes.
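What do those basics look like day to day? Here’s a minimal sketch of the kind of hygiene checks worth automating – the inventory file and its columns are hypothetical, so adapt the schema to whatever your own extracts contain:

```python
import csv
from collections import Counter

REQUIRED_FIELDS = ["user_id", "service", "location", "cost_center"]   # assumed schema

with open("usage_inventory.csv", newline="") as f:                    # hypothetical inventory extract
    rows = list(csv.DictReader(f))

# Completeness: records missing any required field
incomplete = [r for r in rows if any(not (r.get(field) or "").strip() for field in REQUIRED_FIELDS)]

# Duplicates: the same user/service combination appearing more than once
pair_counts = Counter((r.get("user_id", ""), r.get("service", "")) for r in rows)
duplicates = [pair for pair, count in pair_counts.items() if count > 1]

print(f"{len(rows)} records checked")
print(f"{len(incomplete)} incomplete records")
print(f"{len(duplicates)} duplicate user/service combinations")
```

Checks this simple, run regularly, catch most of the issues that would otherwise surface mid-audit or mid-AI-project.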
Trust what you can explain and govern what you build
Explainability and governance are two sides of the same coin. In a risk- and compliance-driven environment, AI can’t be a black box.
“An AI that gives you an answer but not a reason isn’t good enough,” says Amjad. “Users need to understand why it produced that answer, not just what it is. That transparency is what builds trust.”
Ian agrees: “If you can’t see the logic trail, you can’t trust the output. That’s where strong governance meets good design.”
Both stress that enthusiasm must be balanced by oversight. “Anything creative – people or algorithms – needs governance,” Ian adds. “Without checks and balances, you risk introducing new blind spots faster than you fix the old ones.”
For buyers and users alike, that means pushing for clear answers on data security, model transparency, bias prevention and auditability, and ensuring AI tools fit within control frameworks.
Keep measuring and learning
AI adoption isn’t a one-off project. Measure not just accuracy but engagement and impact. Where is the tool saving time? Where does usage plateau? Which teams are leaning in and which are holding back?
“Usage data tells you what’s really happening,” Amjad explains. “If one department uses it constantly and another barely touches it, find out why. That’s how you learn and improve.” For example, as we’ve deployed AI agents across teams such as sales, operations and marketing, we’ve closely monitored detailed usage data so we can provide coaching to teams and even individual users. This helps them become more confident and creative users, helps us get more ROI out of our investment in these tools, and helps us train the AI tools to be even more effective.
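One lightweight way to spot that pattern is to aggregate usage by team. A minimal sketch, assuming you can export usage events with a team label and a user ID – the export file and its columns are hypothetical:

```python
import csv
from collections import defaultdict

events_by_team = defaultdict(int)
users_by_team = defaultdict(set)

with open("ai_usage_events.csv", newline="") as f:        # hypothetical usage export
    for row in csv.DictReader(f):                         # expected columns: team, user_id, timestamp
        team = row["team"].strip()
        events_by_team[team] += 1
        users_by_team[team].add(row["user_id"])

# Rank teams by activity so under-adoption stands out at a glance
for team in sorted(events_by_team, key=events_by_team.get, reverse=True):
    print(f"{team}: {events_by_team[team]} interactions from {len(users_by_team[team])} users")
```

A report like this is where the coaching conversations start: the quiet teams are usually the ones with the most to gain.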
A practical mindset
AI in market data isn’t about applying it for its own sake. It’s about improving how teams work, make decisions and deliver results.
“Ten years ago, no one could stay competitive without moving to the cloud,” Amjad says. “The same is now true for AI – but only if it’s grounded in value, data and security.”
And as Ian concludes: “AI will reshape market data operations, but it won’t change the basics. Define your objectives, understand your data and keep your governance tight. That’s how you make AI work for you, not the other way around.”




