
RWA Is a Calculation. Data Quality Is What Gets It Approved.

20 March 2026 · 6 min read

I governed market risk capital reporting across Citi entities covering more than $3 trillion in annual settlement value. This post reached 4,500 impressions on LinkedIn — which tells me the observation resonates with practitioners who have sat in regulatory reviews. I have expanded it here with the full framework.

There is a pattern I observed across every regulatory review of market risk capital reporting I was involved in at Citi. It never varied.

The first question was always the same: What is your VaR?

The second question was always the same: Walk me through the data.

Banks that stumbled on the second question did not get to defend the first. The VaR number was irrelevant once the regulator had identified a gap in the data infrastructure behind it.

RWA is a calculation. Data quality is what determines whether that calculation is trusted.


What regulators are actually reviewing

Basel 2.5 capital requirements — stressed VaR, incremental risk charge, comprehensive risk measure — are technically achievable by most banks with mature risk systems. The mathematics is well-understood. The regulatory expectations are documented. The models, at this point, are not the differentiator.

What differentiates a clean regulatory review from a remediation programme is the data infrastructure that sits underneath the model.

Regulators are not primarily model validators during a market risk capital review. They are data auditors. They want to know: where does each position come from, what price source was used, who approved any deviation from that source, how was the P&L explained, and what happened on the days when the model's prediction and the actual result diverged.

Every one of those questions is a data question. None of them is a model question.

The banks that perform well in regulatory review are the ones that built their data infrastructure to answer those questions cleanly — before the review, not in response to it.


The four mandates I applied across every trading book programme

Across the market risk capital programmes I ran at Citi, I mandated four non-negotiable data discipline requirements. They were not sophisticated. They were operational. And they were the difference between programmes that passed regulatory review cleanly and programmes that generated findings.

1. Every position mapped to a single authoritative price source

Position data that flows into RWA calculation must have a clear, documented, single source of truth. Not the most convenient source. Not the fastest source. The authoritative source — the one that the risk function, the finance function, and the operations function all agree on.

In practice, this requires resolving conflicts between front office systems, risk systems, and general ledger systems that have accumulated over years of parallel development. That resolution work is unglamorous. It is also the foundation everything else stands on.

If a regulator asks where a position came from and the answer involves a spreadsheet reconciliation between three systems, the review is already in difficulty.
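As an illustration, the conflict detection this mandate implies can be sketched in a few lines. The feeds, position IDs, and source names below are hypothetical, not actual bank systems:

```python
from collections import defaultdict

# Hypothetical position-to-price-source mappings from three parallel systems.
front_office = {"POS-001": "VENDOR_A", "POS-002": "VENDOR_A"}
risk_system = {"POS-001": "VENDOR_A", "POS-002": "VENDOR_B"}
general_ledger = {"POS-001": "VENDOR_A", "POS-002": "VENDOR_B"}

def find_source_conflicts(*feeds):
    """Return positions whose recorded price source differs across systems."""
    sources = defaultdict(set)
    for feed in feeds:
        for position_id, source in feed.items():
            sources[position_id].add(source)
    return {pid: srcs for pid, srcs in sources.items() if len(srcs) > 1}

conflicts = find_source_conflicts(front_office, risk_system, general_ledger)
# POS-002 carries two candidate sources and needs an authoritative ruling
# before it can flow into the RWA calculation.
```

The point of the check is not sophistication; it is that the conflict surfaces in a system, as data, rather than in a spreadsheet reconciliation during the review itself.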

2. Every override logged with reason and approver

Overrides are legitimate. Prices are sometimes stale. Proxies are sometimes necessary. The issue is not that overrides exist — it is that they frequently exist without documentation that explains why they were applied and who authorised them.

I required every override to carry a structured reason code, a free-text explanation, a named approver, and a timestamp. Not as a bureaucratic exercise. Because a regulator reviewing a stressed VaR calculation that relies on overridden prices needs to understand whether those overrides reflect genuine market conditions or data quality workarounds.

The audit trail for overrides is also a risk management tool. Patterns in override frequency, by desk or by instrument type, are early warning signals that the underlying data infrastructure has a problem that needs to be resolved rather than managed around.
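A minimal sketch of such an override record, with illustrative reason codes and field names (not an actual bank schema), plus the desk-level frequency count used as the early-warning signal:

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative reason codes; a real programme would maintain a controlled vocabulary.
REASON_CODES = {"STALE_PRICE", "ILLIQUID_MARKET", "VENDOR_ERROR", "PROXY_FALLBACK"}

@dataclass(frozen=True)
class PriceOverride:
    position_id: str
    desk: str
    reason_code: str   # structured code from the controlled vocabulary
    explanation: str   # free-text rationale
    approver: str      # a named individual, not a team alias
    timestamp: datetime

    def __post_init__(self):
        # Reject overrides that bypass the controlled vocabulary.
        if self.reason_code not in REASON_CODES:
            raise ValueError(f"unknown reason code: {self.reason_code}")

def override_frequency_by_desk(overrides):
    """Override counts per desk; a rising count is an early-warning signal."""
    return Counter(o.desk for o in overrides)
```

Making the record immutable and validating the reason code at creation time means an undocumented override simply cannot enter the log.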

3. Every proxy flagged with methodology documented

Proxies — using one instrument's price to represent another's where direct market data is unavailable — are a standard feature of market risk measurement. Illiquid positions, complex structures, and emerging market instruments frequently require proxy treatment.

The requirement I enforced was that every proxy be flagged in the system, with the methodology documented: what instrument was used as the proxy, why it was selected, what the correlation basis was, and when it was last reviewed.

Undocumented proxies are a significant regulatory exposure. When a regulator asks why a position in one instrument was priced using data from a different instrument, "that is how the system works" is not an acceptable answer. The methodology must exist, be documented, and be defensible.
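One way to make the proxy flag concrete is a record like the following sketch. The fields mirror the documentation requirements above; all names and the one-year review cycle are illustrative assumptions:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class ProxyMapping:
    target_instrument: str   # instrument lacking direct market data
    proxy_instrument: str    # instrument whose price stands in for it
    rationale: str           # why this proxy was selected
    correlation_basis: str   # e.g. "90-day return correlation of 0.93"
    last_reviewed: date

    def is_stale(self, as_of: date, max_age_days: int = 365) -> bool:
        """Flag proxies whose methodology review is overdue."""
        return (as_of - self.last_reviewed).days > max_age_days
```

With the review date on the record, "when was this proxy last reviewed" becomes a query, not an investigation.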

4. P&L attribution reconciled daily — not monthly

Monthly P&L attribution reconciliation was the norm in several programmes I inherited. It is insufficient.

Daily P&L attribution — explaining the day's profit and loss in terms of the risk factors that drove it — is the operational heartbeat of a defensible market risk framework. It surfaces data quality issues immediately, when they can be investigated and resolved, rather than at month-end when the window for clean explanation has closed.

It also provides the daily evidence trail that regulators require to assess whether the VaR model is performing as expected. A model that consistently over- or under-predicts actual P&L is not a well-calibrated model — and that pattern only becomes visible if attribution is running daily.
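The daily check itself reduces to a small calculation: sum the risk-factor explanations, compare against actual P&L, and test the unexplained residual against a tolerance. The figures, factor names, and tolerance below are hypothetical:

```python
def attribute_daily_pnl(actual_pnl, factor_pnl, tolerance):
    """Compare actual P&L with the sum of risk-factor explanations.

    factor_pnl: dict of risk factor -> explained P&L contribution.
    Returns (unexplained residual, breach flag).
    """
    explained = sum(factor_pnl.values())
    unexplained = actual_pnl - explained
    return unexplained, abs(unexplained) > tolerance

# Hypothetical day: rates and FX explain most, but not all, of the move.
residual, breach = attribute_daily_pnl(
    actual_pnl=1_250_000,
    factor_pnl={"rates_delta": 900_000, "fx_delta": 280_000, "credit_spread": 40_000},
    tolerance=50_000,
)
# residual = 30_000, within tolerance, so no breach is flagged
```

Run daily, a breach is a same-day investigation; run monthly, it is an unexplainable month-end residual.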


Build the defensibility before you build the model

The sequencing error I have seen in market risk capital programmes is consistent: organisations invest heavily in model sophistication and lightly in data infrastructure. They build an advanced risk model on top of data that cannot be defended under scrutiny.

The result is a programme that performs well in internal testing and poorly in regulatory review. The model is not the problem. The data is.

The correct sequence is the reverse. Build the data infrastructure first. Establish the authoritative price sources, the override governance, the proxy documentation, the daily reconciliation cadence. Run that infrastructure for a reporting cycle and validate that it produces clean, defensible outputs before the model is layered on top of it.
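That sequencing can be enforced as a simple gate, assuming upstream tooling reduces each of the four mandates to a pass/fail check; the mandate names here are illustrative:

```python
# The four data mandates, each reduced to a boolean by upstream checks.
MANDATES = (
    "single_authoritative_price_source",
    "overrides_logged_with_reason_and_approver",
    "proxies_flagged_with_documented_methodology",
    "pnl_attribution_reconciled_daily",
)

def readiness_gate(checks):
    """Return failing mandates; model build proceeds only on an empty list."""
    return [m for m in MANDATES if not checks.get(m, False)]

failures = readiness_gate({
    "single_authoritative_price_source": True,
    "overrides_logged_with_reason_and_approver": True,
    "proxies_flagged_with_documented_methodology": False,
    "pnl_attribution_reconciled_daily": True,
})
# -> ["proxies_flagged_with_documented_methodology"]
```

Note the default: a mandate with no evidence fails. The gate assumes nothing is defensible until the infrastructure proves it.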

This is not risk management sophistication. It is operational discipline. It is the kind of discipline that does not generate conference presentations or technology award nominations. It generates clean regulatory reviews and avoided remediation costs — which, at the scale of a Tier 1 bank's trading book, are material outcomes.

Build the defensibility before you build the model.

Everything else is architecture on an unstable foundation.


The Basel IV implication

The Fundamental Review of the Trading Book — Basel IV's market risk framework — tightens every one of these requirements significantly. P&L attribution testing becomes a formal regulatory threshold, not an internal management tool. The boundary between banking book and trading book is more precisely defined and more closely scrutinised. Internal model approval requirements are more demanding.

The banks that will navigate Basel IV implementation most cleanly are the ones that built the data infrastructure discipline under Basel 2.5. Not because the frameworks are identical — they are not — but because the underlying operational habits are the same: authoritative sources, documented overrides, traceable proxies, daily reconciliation.

If your Basel 2.5 data infrastructure cannot pass the four tests above, your Basel IV readiness programme has a foundation problem that model upgrades will not solve.


Frequently asked questions

What is RWA in banking?

Risk-Weighted Assets (RWA) is the calculation that determines minimum capital requirements under Basel frameworks. The calculation itself is well-understood. What determines whether it survives regulatory scrutiny is the quality and traceability of the input data — position sources, price overrides, proxy methodologies, and P&L attribution.

What do regulators actually look at during a Basel RWA review?

Regulators follow a consistent pattern: the VaR number first, then "walk me through the data". Every position needs an authoritative price source. Every override needs a logged reason and approver. Every proxy needs documented methodology. P&L attribution needs daily reconciliation. Banks that cannot answer the data questions do not get to defend the model.

Why do RWA programmes fail regulatory review?

Not because the calculation is wrong but because the data infrastructure behind it cannot be defended. Common failure points: positions mapped to non-authoritative sources, overrides with no audit trail, proxy methodologies undocumented, P&L attribution reconciled monthly rather than daily. Regulators probe these points systematically.

What is the difference between Basel 2.5 and Basel IV for market risk?

Basel 2.5 introduced stressed VaR and the incremental risk charge. Basel IV replaces the Basel 2.5 market risk framework with the FRTB: more prescriptive, with higher data quality requirements and formal P&L attribution testing thresholds. The data discipline built under Basel 2.5 is a direct prerequisite for Basel IV readiness.


Found this useful? I write weekly on banking technology, regulatory capital, data engineering, and the lessons 24 years at Citi and Standard Chartered taught me. Subscribe to the newsletter — no spam, unsubscribe anytime.

Working on a Basel implementation or market risk data infrastructure challenge? Explore my consulting services or get in touch directly.

Raj Thilak is Head of Technology for Data & Analytics with 24 years at Citi and Standard Chartered. He has governed market risk capital reporting across entities covering more than $3 trillion in annual settlement value and led Basel 2.5 implementation programmes across multiple jurisdictions. Based in Pune, India. rajthilak.dev
