As Covid‑19 impacts the autocallables business, solutions to navigate new challenges are crucial

By Murex | Advertisement | 16 September 2020

With pandemic losses impacting derivatives activity, Murex’s MX.3 software solution for autocallables comes to the fore

Equity structured products are big business for the largest financial institutions worldwide. The autocallable is the uncontested number one offering in the equity exotics landscape.

In a fiercely competitive market, financial institutions often focus on honing a few autocall variations to stand out.

Final clients are usually individuals with different wealth profiles – such as retail clients in South Korea or wealth management clients in Singapore – with one or several distribution intermediaries between them and the product issuer.

There are also strong regional particularities in the types of autocallables sold. South Korea has seen the emergence of the Lizard and then the Komodo autocall, which have since spread across South-east Asia – the daily range accrual is also very popular in the region. In Russia, half of term sheets have different observation types for the knock-out and the final payout.

As the Covid‑19 pandemic has severely impacted derivatives activity, the importance of a software solution for autocallables has been underscored. 

Managed well, an autocall business can be very profitable. It is also very risky. Issuers love the autocall business – in bullish markets it can early-redeem very quickly, in which case a new investment product can be sold to the same client while rolling the hedge, generating more frequent margins. In turbulent markets, however, it can be quite a different story.

Though quite stable since the financial crisis that began in 2007–08, dividends were a driver of the pandemic losses incurred by some equity derivatives desks: these dividends had already been announced and were therefore not hedged. Past losses have also been caused by the vega peak – the sudden drop in vega that forces desks to buy back hedging options at a high price after having sold them at a low price.

Several factors are required to build a profitable autocallable business.

Currently used by more than 20 clients, including issuers, the MX.3 solution provides support by leveraging its integrated platform model and strong investment in the autocallable business line

Jean-Baptiste Dusson, head of equity derivatives product management, Murex

This has proven essential amid recent market volatility – Murex’s comprehensive set of risk measures, accurate trade representation and overall capacity to manage rising volumes have been important for navigating the new reality. 

Real-time portfolio management and overhedge techniques 

Going forward, it is essential to have a real-time decision-making tool to efficiently operate and manage specific autocallable risks. Understanding the challenges these aspects present and having the IT solution to overcome them is critical.

As for any non-linear activity, hedging the spot risk is the top priority. When selling an autocallable, a trader is short delta. Near payout discontinuities, to avoid the liquidity risk of suddenly having to buy or sell large quantities of the underlying, traders rely on overhedge techniques, which typically include replacing barriers and digitals with conservative call spreads, sometimes applying a bend to the knock-in barrier.

Those pricing adjustments smooth the net present value and the Greeks profile, enabling more secure delta hedging. Traders choose the size of the call spread depending on stock liquidity and digital risk; the cost is borne by investors.
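As an illustration only – a minimal sketch under flat Black-Scholes assumptions, not MX.3 code – a digital embedded near a barrier can be over-replicated with a tight call spread, which is what smooths the Greeks and shifts the cost on to the investor:

```python
# Illustrative sketch: replacing a digital struck at B with a conservative
# call spread. Buying 1/w units of the (B - w, B) call spread over-replicates
# a cash-or-nothing digital paying 1 above B, smoothing delta near the barrier.
import numpy as np
from scipy.stats import norm

B, w, T, sigma, r = 100.0, 2.0, 1.0, 0.25, 0.0   # barrier, spread width, maturity, vol, rate

def bs_call(S, K):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def digital(S):
    # Black-Scholes price of a cash-or-nothing digital paying 1 if S_T >= B
    d2 = (np.log(S / B) + (r - 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    return np.exp(-r * T) * norm.cdf(d2)

def overhedge(S):
    # Conservative replacement: 1/w units of the (B - w, B) call spread
    return (bs_call(S, B - w) - bs_call(S, B)) / w

for S in (90.0, 99.0, 100.0, 101.0, 110.0):
    print(f"S={S:5.1f}  digital={digital(S):.3f}  call-spread overhedge={overhedge(S):.3f}")
```

The overhedge always prices at or above the digital it replaces; the gap is the cost of removing the discontinuity.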

Managing vega is also critical – it is more complex than for vanilla products. Using a global vega for an autocall lacks precision. Traders require a topography by strike and maturity to identify where risk is concentrated. Getting this topography right on a sizeable book in a timely manner is a quantitative challenge.
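A minimal bump-and-revalue sketch of how such a vega topography can be built (hypothetical code, with a single Black-Scholes vanilla standing in for the real autocall Monte Carlo pricer):

```python
# Illustrative vega map by (maturity, strike) bucket via bump-and-revalue.
# price_book() is a stand-in: in practice it would be the autocall Monte Carlo
# repriced off the bumped implied volatility surface.
import numpy as np
from scipy.stats import norm

S0, r = 100.0, 0.01
strikes = np.array([80.0, 100.0, 120.0])
tenors = np.array([0.5, 1.0, 3.0])
vols = np.full((len(tenors), len(strikes)), 0.20)       # flat 20% surface

def bs_call(S, K, T, sigma):
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d1 - sigma * np.sqrt(T))

def price_book(surface):
    return sum(bs_call(S0, K, T, surface[i, j])
               for i, T in enumerate(tenors) for j, K in enumerate(strikes))

base = price_book(vols)
vega_map = np.zeros_like(vols)
for i in range(len(tenors)):
    for j in range(len(strikes)):
        bumped = vols.copy()
        bumped[i, j] += 0.01                            # bump one bucket by 1 vol point
        vega_map[i, j] = price_book(bumped) - base      # P&L impact of that bucket

print(vega_map)                                         # rows = tenors, columns = strikes
```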

During quiet periods, dividend risk can easily be hedged once properly bucketed by maturity. In more turbulent markets, however, this must be complemented by what-if scenarios, because dividend variations can suddenly become very large.

Real-time portfolio management supported by robust analytics and reliable, scalable architecture is at the core of Murex’s MX.3 solution for steering an autocall business. The solution also includes real-time risk ladders, which provide a good illustration of positions in the autocall books by showing deformation of the main risk figures as market data shifts. It has several views to control operational risks, including barriers and fixings management, upcoming expiries and dividends. Its analysis tool, PL Explain, saves precious time for traders, illustrating their daily profit-and-loss variation by risk factor.

Models that home in on what matters 

In recent years, autocallable losses have been attributable to limited capture of spot-volatility dynamics, a consequence of overreliance on local volatility models. Those models do not capture the path-dependent nature of autocallables well.

Zooming into the spot-volatility dynamics, a seller of an autocall is short vanna, which can be very problematic in bearish times: when the spot goes down, vega increases to a certain level and then drops sharply as the barrier is approached (see figure 1).

Stochastic volatility models capture these nuances, enabling a good fit to vanilla options while providing realistic forward probabilities – bridging a gap left by local volatility. Striking the right balance between the local and stochastic volatility dimensions, local stochastic volatility (LSV) models allow the spot-vol dynamics to be controlled and matched to real market behaviour.
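One common way of writing such an LSV model – a generic textbook formulation, not necessarily the parameterisation used in MX.3 – combines a Heston-type variance process with a leverage function calibrated to the vanilla surface:

$$
\begin{aligned}
\mathrm{d}S_t &= r\,S_t\,\mathrm{d}t + L(S_t,t)\,\sqrt{V_t}\,S_t\,\mathrm{d}W^1_t,\\
\mathrm{d}V_t &= \kappa(\theta - V_t)\,\mathrm{d}t + \xi\,\sqrt{V_t}\,\mathrm{d}W^2_t,\qquad \mathrm{d}\langle W^1,W^2\rangle_t = \rho\,\mathrm{d}t,
\end{aligned}
$$

where the leverage function $L(S,t)$ is calibrated so that vanillas are repriced exactly, while the mixing parameters $\xi$ and $\rho$ control the spot-vol dynamics that local volatility alone cannot capture.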

Typically, stochastic volatility tends to be higher in Asia, due to the predominant autocall business in the region and the lack of diversification among market participants. Murex researchers have observed this by analysing the historical long-term skew stickiness ratio (SSR), a metric introduced by Bergomi that measures the move in at-the-money volatility, as a proportion of the skew, when spot moves. While this ratio is close to the SSR induced by sticky-strike dynamics for the S&P 500 and Euro Stoxx 50, it is clearly below it for Asian indexes such as the Hang Seng China Enterprises Index and the Nikkei. In other words, in markets saturated with autocallables, ‘spot up equals vol down’ is less pronounced.
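For reference, the SSR for maturity $T$ can be written as the move in at-the-money volatility per unit move in log-spot, divided by the skew (the notation here is generic, not Murex's):

$$
\mathrm{SSR}(T) \;=\; \frac{1}{\mathcal{S}_T}\,\frac{\mathrm{d}\,\hat{\sigma}_{\mathrm{ATM}}(T)}{\mathrm{d}\ln S},
\qquad
\mathcal{S}_T \;=\; \left.\frac{\partial \hat{\sigma}(K,T)}{\partial \ln K}\right|_{K=F_T},
$$

so sticky-strike dynamics correspond to $\mathrm{SSR}=1$, while a value below 1 – as observed on the Asian indexes – means at-the-money volatility moves less than the skew would imply when spot moves.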

The autocallable’s knock-in put is the part of the payout most sensitive to stochastic volatility. Market actors know that the longer the maturity and the lower the early-redemption barrier, the higher the impact of stochastic volatility. The ‘worst of’ feature of multi-underlying autocallables also increases that impact. This explains why major players use an LSV model as a benchmark for pricing and for taking provisions on their books on a regular basis.

Despite well-known limitations, the local volatility model is still the market standard for intraday risk management. The MX.3 solution features a local volatility model used by Murex clients, which is powered by graphics processing units to accelerate Monte Carlo computations, increasing accuracy, and an LSV model, which serves as a cutting-edge alternative for pricing.

According to our studies, the impact of considering LSV on mono-underlying standard autocall could reach up to 100 basis points compared to standard local volatility pricing

Cherif Ben Mlouka, product manager for equity derivatives, Murex

Trade representation and operations 

Managing an autocall business isn’t exclusively a front-office challenge.

As a starting point, you need a clean representation of trades for easy input and to ensure the back-office chain can be optimally automated. Otherwise, it can become very costly.

The MX.3 solution has been enriched to cover regionally specific financial clauses in various formats, from swaps to structured notes, leveraging Murex’s payout language to meet time-to-market challenges. The lifecycle is managed seamlessly thanks to the front-to-back integration of the platform. This includes, for instance, the option to exercise the final payout with physical settlement, as well as the automatic move of swap trades to fallback rates in anticipation of the Libor transition. And all this remains true for a client using its own analytics.

To manage the lifecycle of trades, a reliable system is necessary. When thousands of knock-in events activating the put at maturity need to be applied, you have to identify eligible trades, apply events properly, streamline investor notices, generate an accurate new version of the transaction with the put at maturity and anticipate the event-related sensitivities gap to rebalance hedges. 

One desk forgot to update some vol surface for a number of months and saw losses in the tens of millions

Jean-Baptiste Dusson, head of equity derivatives product management, Murex

Market data management is also a key aspect and is a source of operational risk when handling numerous correlation, volatility and dividend structures. 

Dealing with increasing volumes is critical to maximising profitability. From the digital distribution of the product (quoting on various platforms, responding to RFQs, sales-trader workflow) to trade processing (lifecycle management, including corporate actions), automation is key. Systems must be scalable, particularly for risk management, which usually generates significant hardware expenses as well.

Running a profitable autocall business and the various processes attached to it – including issuance, pricing, structuring, distribution, trade representation, market data sourcing and calibration, confirmation, settlements, events, and corporate actions, all while being able to manage risk – is a heavy lift. But it can be done. The Murex MX.3 solution makes it possible. 

Contact Murex at info@murex.com for more information about its solution to price and manage a large range of autocallables.

Op risk data: Revlon lenders won’t make up over Citi error

By ORX News | Opinion | 9 September 2020

Also: Cyber fines on the up; and more fat-finger fails of yore. Data by ORX News


August’s largest loss event occurred at Citi, which accidentally wired $900 million to a group of lenders to cosmetics giant Revlon. The two sides were already locked in a dispute over a soured loan to the private equity-backed firm. As of August 21, Citi had not recovered a total of $520.4 million, for which the bank is now suing the lenders involved.

On August 11, Citi sent notice to Revlon lenders, intending to pay them accrued interest. However, due to apparent issues with loan-processing systems, the payment to each lender was on average more than 100 times the interest due. Multiple news outlets suggest the bank inadvertently paid back both the loan principal and the accrued interest. Citi had intended to send one lender a total of $1.5 million, for example, but instead sent $176.2 million.

The bank was able to stop some of the payments, which totalled almost $900 million. It has now filed lawsuits against lenders who are refusing to pay back the money, accusing them of holding on to funds to which they are not otherwise entitled. The lenders argue that Revlon had defaulted on its loans and that they were therefore entitled to keep the funds as repayment. The day after Citi’s payment, Revlon’s lenders sued the company.


The second-largest loss in August occurred at Scotiabank, which was ordered by the US Department of Justice and the Commodity Futures Trading Commission to pay over $127.4 million for multiple instances of precious metals price manipulation.

Between January 2008 and July 2016, four traders at the firm placed multiple unlawful orders for gold, silver and other metals futures traded on commodities exchange Comex to deceive other traders and benefit their employer. The CFTC found the traders had been spoofing the market and had made false statements during the CFTC’s investigation. The charges also concerned swap dealer compliance and supervision violations.

In all, regulators found that, over a seven-year period, the bank had concealed its full mark-up from counterparties on tens of thousands of swaps. The bank had violated various requirements relating to its counterparty onboarding process, record-keeping, chief compliance officer reporting and supervision, they found, and had made false or misleading statements to CFTC staff concerning its audio retention and supervision.

The CFTC ordered Scotiabank to pay $127.4 million, while the Department of Justice ordered it to pay a monetary penalty of $42 million, disgorgement of $11.8 million and victim compensation of $6.6 million. Up to half of the monetary penalty may be offset against the CFTC’s payment, however. ORX has therefore recorded the loss amount as $127.4 million until the settlements are finalised.

TD Bank paid $122 million in restitution over overdraft enrolment practices, placing it third in August’s largest losses.

The US Consumer Financial Protection Bureau found that, from January 1, 2014 until December 31, 2018, the bank had failed to obtain consumers’ affirmative consent to enrol them in the bank’s optional overdraft protection service.

It subsequently charged those consumers overdraft fees for ATM and one-time debit card transactions, violating the Electronic Fund Transfer Act and US Federal Reserve Board’s Regulation E guidelines for issuers of electronic debit cards.

The CFPB also found instances of TD Bank violating the Consumer Financial Protection Act of 2010 by engaging in abusive acts or practices by materially interfering with consumers’ ability to understand terms and conditions. For example, TD Bank presented the service to new customers as “free” or as a “feature” or “package”, despite charging customers $35 for each overdraft transaction.

After its investigation, the CFPB ordered TD to pay an estimated $97 million in restitution to 1.42 million affected customers and to pay a civil monetary penalty of $25 million. TD Bank said it disagreed with the CFPB’s conclusions and did not admit any wrongdoing.

In August’s fourth largest loss, Capital One was fined $80 million by the US Office of the Comptroller of the Currency for failing to establish effective cyber risk assessment processes from 2015 and to correct these deficiencies in a timely manner. The deficiencies were made evident by a data breach in April 2019, when a hacker stole the personal data of 100 million credit card applicants, as well as 140,000 social security numbers and 80,000 bank account numbers of existing credit card customers.

The banking watchdog found Capital One failed to establish effective risk assessment processes before transferring its IT operations to a cloud operating environment in or around 2015. Capital One also failed to establish appropriate risk management for the cloud operating environment, including appropriate design and implementation of certain network security controls, adequate data loss prevention controls and effective dispositioning of alerts.

August’s fifth largest loss occurred at Interactive Brokers, which was fined a total of $38.7 million over anti-money laundering failings by the US Securities and Exchange Commission, the CFTC and the Financial Industry Regulatory Authority.

The fines related to Interactive Brokers’ failure to file suspicious activity reports (SARs) from at least July 1, 2016 to June 30, 2017. The SEC found that the firm ignored or failed to recognise numerous red flags, failed to properly investigate certain conduct as required by its written supervisory procedures, and failed to file SARs on suspicious activity. It also failed to review at least 14 deposits of US microcap securities where the security had been subject to an SEC trading suspension. These failures resulted from Interactive Brokers failing to implement a reasonable surveillance programme.


Story spotlight: Bank of Ireland, Capital One slammed by cyber fines

Two fines levied in recent months should serve as a warning to banks that regulators are taking a strong stand against inadequate cyber controls, which can have a major financial impact.

In the first case, the thieves’ means were modern, but the methods old-school, rooted in coaxing out sensitive information through successful phishing attempts.

In July, the Bank of Ireland was fined €1.66 million ($1.86 million) by the Central Bank of Ireland for breaches of Mifid regulations related to the bank’s former subsidiary, Bank of Ireland Private Banking Limited, between November 2007 and January 2018.

Mifid fine for Bank of Ireland

The central bank’s investigation arose from a cyber-fraud incident in September 2014, where a fraudster impersonating a client made BOIPB make payments to a third-party account totalling €106,330. BOIPB’s procedures outlined steps to verify a client’s identity before processing a third-party payment instruction. However, BOIPB staff released confidential account details to the fraudster and did not ask security questions when taking transfer instructions. Nor did staff identify certain flags which could have been indicative of fraud.

The €1.66 million fine was not the only fine levied on inadequate controls against cyber fraud over the last few months. Capital One was fined $80 million by the US Office of the Comptroller of the Currency for failing to establish effective cyber risk assessment processes from 2015 (see above).


In Focus: Fattest finger first – clumsy digits make chance millionaires

Some make-up companies really can make you feel like a million dollars. In one of 2020’s largest op risk losses so far, Citi accidentally sent approximately $900 million to a group of Revlon lenders when making accrued interest payments on loans to the cosmetics giant. After notifying the lenders that the transfer had been a mistake, Citi was able to recoup some of the $900 million. But some refused to return the funds, leaving a gap of around $411 million.

The bank has blamed the fat-finger flub on a clerical error, which resulted in it making hundreds of payments to hedge funds for amounts roughly 100 times larger than they should have been.

Citi’s blunder is slight in comparison to other fat-finger errors. In total, ORX News has recorded 14 events where funds, often of more than a billion dollars, have been incorrectly transferred to third parties – the big difference being that most recipients played nicely and gave the money back.


The largest fat-finger transfer in the ORX News database occurred at Deutsche Bank, which accidentally transferred €28 billion to an account at Deutsche Börse’s Eurex clearing house in March 2018 when conducting a daily collateral adjustment. The error should have been detected by an internal fail-safe system, known as a bear trap.

Ironically, the bear trap had been introduced after an internal audit prompted by another fat-finger error at the bank in March 2014. In that incident, an error occurred during the use of Deutsche Bank’s collateral management system. The bank’s controls at the time required transactions to be checked by a second employee, but that check failed. As a result, Deutsche Bank accidentally transferred €21 billion to Macquarie as collateral for an over-the-counter derivatives trade.

Luckily for Deutsche Bank, both amounts were reportedly recovered within hours and the bank suffered no major loss. However, such incidents highlight the need for enhanced processes and controls surrounding transfers. After Deutsche Bank’s 2014 incident, the bank introduced an enhanced fail-safe system to ensure that all payments exceeding a specified amount are subject to increased scrutiny. It was this control system that failed in March 2018.

Fat-finger errors have also rocked the stock markets. In October 2014, an anonymous broker accidentally entered OTC trades on 42 stocks worth a combined $617 billion. OTC trades are often made without an added level of scrutiny from an exchange, thus increasing the likelihood of fat-finger mistakes.

So, what controls can be effective against these fat-finger errors? Controls often include pre-trade order size limits, which prevent block trades above a certain limit from being entered into a ledger.
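A minimal sketch of what such a control looks like in practice (hypothetical limits and field names, not any particular vendor’s implementation):

```python
# Illustrative pre-trade order size check: block orders above a per-instrument
# notional limit before they reach the market. Limits and field names are
# hypothetical.
from dataclasses import dataclass

SIZE_LIMITS = {"EQUITY": 50_000_000, "GOVT_BOND": 250_000_000}  # notional caps

@dataclass
class Order:
    instrument_type: str
    quantity: float
    price: float

def pre_trade_check(order: Order) -> bool:
    """Return True if the order passes the fat-finger size control."""
    notional = order.quantity * order.price
    limit = SIZE_LIMITS.get(order.instrument_type, 10_000_000)
    if notional > limit:
        print(f"REJECTED: notional {notional:,.0f} exceeds limit {limit:,.0f}")
        return False
    return True

pre_trade_check(Order("EQUITY", quantity=1_000_000, price=617.0))  # rejected
pre_trade_check(Order("EQUITY", quantity=10_000, price=61.70))     # passes
```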

The Markets in Financial Instruments Directive also sets out guidelines on systems and controls to prevent fat-finger errors in an automated trading environment. These regulatory requirements have been further strengthened under Mifid II, with requirements applying not only to trading firms but also to trading venues.

Fat-finger errors in both trading and payment processing can heavily impact the stock market and cause huge losses for the responsible firm. And, as with Deutsche Bank, the controls put in place to prevent billions of euros from being mistakenly sent to the wrong counterparty can fail more than once.

As the Citi case shows, when the controls against such mistakes fail, it is not always possible to recover the losses in their entirety – demonstrating the importance of strong controls that prevent large financial losses before they happen.

Editing by Louise Marshall

All information included in this report and held in ORX News comes from public sources only. It does not include any information from other services run by ORX, and we have not confirmed any of the information shown with any member of ORX.

While ORX endeavours to provide accurate, complete and up-to-date information, ORX makes no representation as to the accuracy, reliability or completeness of this information.

Buy-side trading system of the year: Tradeweb

By Asia Risk staff | Analysis | 9 September 2020

Asia Risk Technology Awards 2020

When it comes to trading in the capital markets, sourcing liquidity remains one of the highest priorities.

Tradeweb helps asset managers, central banks and other institutional investors access the liquidity they need across a broad range of fixed income, derivatives and equities products. Its electronic platform connects buyers and sellers of financial instruments and helps its clients find the best price and execution method for their transactions.

It works directly with liquidity takers and liquidity providers to create transparent and efficient ways to trade and provide solutions across the trade life cycle, including pre-trade, execution, post-trade and data.

In the past 20 years, Tradeweb has built a global business providing efficient and transparent access to liquidity across more than 40 products. It serves more than 2,500 institutional, wholesale and retail firms across more than 65 countries.

Its marketplaces facilitate trading across 24 currencies and asset classes, including rates, credit, money markets and equities. On average, Tradeweb facilitated more than $790 billion in notional value traded per day over the past four fiscal quarters, with real-time pre-trade pricing from over 50 leading liquidity providers.

Since Tradeweb launched its Automated Intelligent Execution (AiEX) tool in Asia in September 2018, institutional trading desks in the region have been using it to increase speed to market and reduce costs and operational risks.

The tool also enhances trading efficiency and frees up capacity to focus on larger transactions, essentially helping to unlock liquidity through more intelligent trading. AiEX allows traders to control all aspects of trading by automating how orders are submitted to the market and defining rules for accepting and rejecting trades. Asian trading desks, particularly when trading in large size, want flexibility over trade timing.

Tradeweb has a time-release feature that helps clients pre-plan and set up trade execution timing ahead of busy periods or non-local trading hours. This feature allows traders to shift their focus to other priorities.
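To illustrate the kind of rule-based automation and time-release logic described above – a hypothetical sketch only, not Tradeweb’s actual AiEX rule syntax or API – an execution rule might combine order size, price tolerance and release-time conditions:

```python
# Hypothetical sketch of rule-based automated execution: an order is released
# at a scheduled time, and the best dealer quote is accepted only if it sits
# within a tolerance of a composite reference price. Field names are invented.
from datetime import datetime, timezone

def should_release(order, now=None):
    now = now or datetime.now(timezone.utc)
    return now >= order["release_time"]                  # time-release condition

def auto_execute(order, quotes, composite_mid, tolerance_bps=5, max_notional=10_000_000):
    if order["notional"] > max_notional:
        return None                                      # too large: route to a human trader
    best = min(quotes) if order["side"] == "BUY" else max(quotes)
    slippage_bps = abs(best - composite_mid) / composite_mid * 1e4
    return best if slippage_bps <= tolerance_bps else None

order = {"side": "BUY", "notional": 2_000_000,
         "release_time": datetime(2020, 9, 9, 7, 0, tzinfo=timezone.utc)}
if should_release(order):
    fill = auto_execute(order, quotes=[100.02, 100.03, 100.05], composite_mid=100.00)
    print("no fill - outside tolerance" if fill is None else f"executed at {fill}")
```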

Li Renn Tsai, Tradeweb

Li Renn Tsai, managing director and head of Asia at Tradeweb, says looking ahead, Tradeweb will continue to enhance client workflows. This is possible due to Tradeweb’s scale and breadth across asset classes, products and regions. It also collaborates with customers to build smarter trading technologies.

“For example, we constantly look at new ways to use our extensive and robust market data to better inform price discovery and order execution. Pre- and post-trade intelligence is key, particularly in this new work-from-home environment. Data is also crucial in supporting clients’ automation decisions via our AiEX tool, ultimately helping them achieve their trading objectives in different market conditions,” he says.

Traders can also quantify transaction costs using Tradeweb’s proprietary transaction cost analysis tool. Feedback from this same application can be used to iterate and refine execution rules in line with clients’ own best-execution policies. For example, in Japan buy-side dealing desks often have to physically document their adherence to best execution on a post-trade basis. If trades are executed via Tradeweb AiEX, this is no longer necessary, as compliance with local execution rules and regulations can be built into the automated process.

Tradeweb also operates an Asia-listed exchange-traded funds (ETF) marketplace, which replicates its request-for-quote (RFQ) trading model from Europe and the US. The Asia ETF platform is a natural extension for the growing ETF business at Tradeweb. It allows investors in all regions to tap into the main ETF markets in Asia, such as Hong Kong, Singapore, Taiwan and Tokyo.

Tradeweb was the first offshore platform to link with the Bond Connect programme, providing foreign investors access to China’s mainland fixed-income market. It recently launched a new electronic mechanism on the China Interbank Bond Market (CIBM Direct) – another access channel into China’s bond market – to allow foreign institutional investors to use the disclosed RFQ system.

As a result, it now provides fully electronic access to the two most popular northbound entry channels into China.

For the Bond Connect programme, Tradeweb has now increased the number of allocating sub-accounts from 30 to 50, to facilitate more foreign investor participation. This allows investors to trade up to 50 sub-accounts without having to split the order up, thereby increasing efficiency and saving execution time.

During the 12 months ending May 2020, the average daily trading volume in CNY cash bonds grew by 37% from the previous year.

An area of growth for Tradeweb is in emerging markets interest rate swaps. It currently covers 12 markets, including five in Asia. Tradeweb recently added an Australian exchange for physical (EFP) to its product suite in response to client demand.

It is also making trading workflows more flexible by developing new or expanding existing protocols, such as request-for-market. “Not having to show the direction they want to trade has really resonated with clients, who may be concerned about information leakage,” Tsai adds. 

Asia Risk judges agree that Tradeweb’s trading system is successful and the most well-developed solution among the submissions. One judge says: “I like AiEX, and the Asia-listed ETF marketplace. Also, Tradeweb’s Bond Connect initiative is great.”

Another judge adds that Tradeweb’s product is well established, and is highly utilised with market-leading transaction flows.

The changing shape of buy-side risk technology

By FactSet | Advertisement | 25 August 2020

Buy-side risk managers and FactSet’s global head of quantitative analytics gathered for a Risk.net webinar to discuss topical risk management trends for asset managers and to consider the industry challenges posed by the recent Covid‑19 pandemic

The Panel

  • Boryana Racheva-Iotova, Senior Vice-President and Global Head of Quantitative Analytics and Risk, FactSet
  • Racim Allouani, Head of Portfolio Construction and Risk Management, KKR
  • Lisa Wang, Director of Investment Risk Management, AllianceBernstein
  • Moderator: John Anderson, Contributing Editor, Risk.net

Intense competition, market volatility and a demanding regulatory environment continue to raise the stakes in investment risk management. As asset managers grapple with squeezed budgets and elusive sources of return, the unprecedented disruption caused by the Covid-19 pandemic has served as a painful reminder that future gains and innovation will rely on sound risk management principles across the full set of portfolio and compliance risks. 

Risk leaders must enable and support new investment platforms, data and analytics capabilities, operations and strategies, while ensuring their enterprises remain sound, secure and compliant. 

One discussion topic during the webinar was how specific trends that had already begun taking shape in the industry had accelerated as a result of Covid-19. These trends include the need to review approaches to building the asset allocation mix, asset-liability management (ALM) and goals-based investing in wealth management.

According to Boryana Racheva-Iotova, senior vice-president and global head of quantitative analytics and risk at FactSet, the socioeconomic and geopolitical uncertainty caused by the pandemic has already resulted in much more sophisticated approaches to building asset allocation mix and ALM. 

“A lot of acceleration has been observed within the solutions and advisory groups within the asset management community, as well as in the trend towards shifting assets under management to outsourced chief investment officers, and so forth,” she said.

She addressed FactSet’s focus on supporting firms and risk managers through ongoing change and upcoming trends: “We continue to see those trends being quite strong and, through that, the need to support our clients with data, models and solutions to execute on those activities more efficiently,” she added. 

There has also been a greater focus on building a more holistic understanding of risk, specifically liquidity and credit risks. A key question for the market right now is how the recovery from the Covid-19 pandemic will happen, and what the recovery process will look like. 

Given the market volatility during the crisis, asset managers have needed to get a better grip on understanding alpha and upside potentials, and must be able to take advantage of these opportunities. 

 “We see much more attention on more detailed analysis of alpha as well as risk, really understanding exposures extremely well and what can lead to disruption and dislocation of the exposures, dislocations in the correlations between different asset classes, as well as between securities and types of risk drivers,” said Racheva-Iotova.

New approaches, new trends

A newer approach to risk has also been a major trend. There has been increased attention on risk budgeting, particularly tail-risk budgeting and tail contributions to risk from factors, as well as from groups of assets and securities.

Derivatives risk management is also getting a lot of attention. This approach requires special tools and risk models that necessitate full repricing to capture all of the non-linearities that can come with particular trades, as well as robust stress-testing.

 “In terms of stress-testing, we see hugely increased interest in the types of stress tests, as well as the complexity of the stress tests being built,” said Racheva-Iotova.

Stress-testing has also evolved for AllianceBernstein. Just as different levels of aggregation are being included more, stress tests are being incorporated into earlier stages of the portfolio construction process. For example, if a particular amount is allocated to a specific position, strategy or sector, there needs to be an analysis of the various levels of risks shown within the stress tests.  

“Those types of risks get aggregated at an earlier stage of the portfolio construction. So that is one of the things that has been evolving and has really accelerated post-Covid-19,” said Lisa Wang, director of investment risk management at AllianceBernstein. 

For Racim Allouani, who oversees portfolio construction and risk management at KKR, the main risk management focus has been ensuring companies in the private market have enough liquidity to survive the current pandemic, as well as to potentially keep going during a further shutdown should things change in the future. 

 “We might have to pick our battles, and what we are a little concerned about in the post-Covid world, is that we will probably see some attrition and higher defaults. It might be concentrated in some pockets of the market,” he said. 

In credit, the recovery – or lack of – has varied depending on the specific sector. Retail, travel, leisure and energy have been hit materially, while other sectors, such as pharma, utilities and tech, are outperforming. While orderly and expected, these dispersions have been crucial to monitor and then capitalise on from a risk management perspective for KKR. 

For other firms, there has been a gradual increase in demand for a more comprehensive risk analysis. At AllianceBernstein, the risks at the portfolio level, as well as at various layers of aggregation – whether country, sector or strategy level, or individual stock and position level – must all be factored into risk management analysis and decisions.

 That has developed into looking through “multiple lenses” of risk, said Wang. 

“The other trend we have observed is to bring in a more integrated risk view, in the sense that we care about the allocation to individual stocks, we care about the risk contributions coming from individual positions, we care about what their stress-test characteristics are like, we care about their liquidity. So, for portfolio construction, we are looking through multiple lenses of risks as well,” she said. 

The growing importance of data

Being able to manage and review this comprehensive risk analysis means there is greater demand for larger datasets and for predesigned sets of reports that present the data in the most consistent way possible. This also helps risk managers stay on top of volatile sectors.

The importance of data is a growing trend among buy-side risk managers, particularly as the use of machine learning and artificial intelligence is on the rise. More data and more diverse data is continually needed as the spectrum of risk drivers grows. 

“But that data needs to be useful data, and it should be data that helps us to isolate the risk-related signals instead of just introducing noise into the risk modelling process,” said Racheva-Iotova.

For FactSet, risk data involves two perspectives: the breadth and quality of datasets, and lookback periods that are long enough to be meaningful. For example, the relationship between equity markets and credit default swaps (CDS) 20 years ago is hard to fathom because there may be a lack of data from that period, and some markets – such as CDS – might not have existed at all. Machine learning can help in these circumstances.

Alternative datasets are becoming increasingly important and can reshape risk parameters, as the market landscape becomes more complicated and new types of risk forces and risk drivers constantly change the risk profile.

“Those are alternative new datasets that are definitely helpful, but, again, you need to have a particular purpose, you need to have a particular goal, and then certainly look for the right datasets,” said Racheva-Iotova. “In some instances, the datasets themselves will first of all have to be analysed through machine learning techniques in order to extract the relevant signals before incorporating them into the risk modelling.”

The use of big data has been particularly helpful to KKR during the Covid-19 pandemic. The firm used ‘high-frequency data’ – such as credit card spending patterns, reopening figures and hospital data – and other big data that had not typically been tapped into to any great extent.

“This helped analyse which parts of the economy would be reopening faster. We were doing that in China, for example, because they faced the whole crisis before the West. So we have been using much more data, including alternative data, in this episode of Covid-19, even for the private side,” said Allouani. 

Managing risk in today’s world of asset management sits alongside bringing on the right technology solution. Larger businesses have an advantage given their deep pockets, and Allouani feels it could be “difficult” for small and medium-sized players to remain relevant without combining the right technology with the necessary talent pool. 

“It needs to go hand-in-hand with the necessary talent who can understand the benefits of the technology, apply it for the purpose of the particular asset manager and for the purpose of the investment approaches and investment mandates that they have,” he said. 

Theoretical innovation

While theoretical innovation has been around for decades, especially for public markets, implementing it practically within an organisation and its governance is the harder aspect of a risk manager’s job.

“This is the type of proven innovation that is a must for everybody. This type of innovation is something that needs to be observed within the risk management process, no matter the size of the asset manager. Some asset classes can be managed with relatively accessible technology and others need more customisation and out-of-the-box thinking and implementation,” said Allouani.

For some small to medium-sized players that want a comprehensive set of risk analytics but don’t necessarily have the same budget as some of the bigger players, the key is to plan this in a more structured way. 

 Technology can also be a strength in this kind of scenario. For example, business intelligence-type analytics can help link with the risk data. The type of processing that is generated from this business intelligence software can save businesses a lot of time in terms of building their own risk presentations.

“If you want to view different layers of risk in your individual portfolios, into your individual sectors and strategies, business intelligence software is a tool you could actually use for the small to medium-sized players. So you don’t have to build everything from scratch with a large budget dedicated to a technology development effort,” said Wang. 

Technology has also played a role for AllianceBernstein from an operational risk perspective during the Covid-19 pandemic, and accelerated another trend that was already starting to take hold at the asset manager – a concerted effort to move to a virtual work environment, which has picked up pace since March. 

“Within our own firm we have been migrating to a virtual work environment even before Covid-19. So, even when we are in the office, we don’t have to log in from a particular desktop. We have virtual desktops set up so we can log in from anywhere within our building. Post-Covid-19, we are logging in through a virtual private network, or VPN, but the virtual desktop environment has already been set up so that transition itself is seamless,” concluded Wang.


Watch the full webinar, The changing shape of buy-side risk technology and the role of data for risk managers

The panellists were speaking in a personal capacity. The views expressed by the panel do not necessarily reflect or represent the views of their respective institutions.

Leveraging technology to address modernisation challenges faced by bank treasuries

By Red Hat | Advertisement | 21 August 2020

Bank treasurers and technologists convened for a Risk.net webinar in association with Red Hat to consider how technological innovation could help treasury functions meet rising expectations.

Vincent Caldeira, Red Hat

Confronted with unprecedented challenges, bank treasury operations must adapt – and modernise. Upgrading outmoded infrastructure and leveraging new technologies should be the first port of call for managers.

Right now, the role and mandate of treasuries are broader and more complex than ever before. Not only do they have to fulfil their traditional functions of doling out funding and monitoring investment risk, they are increasingly being called upon to manage capital and leverage constraints in response to regulatory and market pressures.

Senior executives are also pushing treasuries to generate more revenue, adding one more strategic objective to the list. 

Many treasuries, however, are not structured appropriately to assume these new duties. Nor is their technology infrastructure fit for purpose. Embracing the opportunities afforded by a refreshed service-oriented architecture and the capabilities of cloud computing, without compromising on resilience, could go a long way towards meeting these challenges. 

“Treasuries are really now at the point where they need to modernise and build new capabilities very fast and be very agile. At the same time, they cannot really afford to compromise their platform’s amount of resilience and capacity. Typically, this is the toughest challenge from the technology point of view because it is hard to be stable when you need to change fast at the same time,” said Vincent Caldeira, chief financial services technologist Asia-Pacific at Red Hat.

Caldeira explained that treasury platforms should adopt technologies that facilitate a distributed service architecture, which allow different systems to operate more independently from one another and scale separately. This will enable treasurers to decouple different services and make them self-sufficient, without compromising on capacity or stability – ideal for handling multiple complex tasks simultaneously. 

Tailoring capabilities to functions 

Capabilities can also be tailored to different functions more efficiently this way. For example, e-trading requires the rapid processing of quotes and orders – meaning speed is king. Liquidity management, on the other hand, is a volume-heavy process, placing the emphasis on data-crunching capacity. 

“Once you have managed to break down your big monolithic block into a series of self-contained services, then you can put the technologies where it makes more sense,” said Yoann Vandendriessche, senior product director, treasury and capital markets, at Finastra.

Adopting ‘containerisation’, as this software trend is known, also opens up opportunities to better automate routine tasks. Finastra conducted a survey of treasurers, which found that around 65% of their time is spent on manual activities, meaning there’s huge scope for automation to ratchet up productivity and reduce costs. 

Treasuries also need to build business agility through the ability to quickly react to new market conditions in real time. To this end, Caldeira recommended treasurers adopt event-driven data pipeline systems, which can ‘stream’ data instantaneously to decision and risk management systems, but also greatly ease the process of integrating with other departments – such as finance, compliance and risk management.
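A toy sketch of the event-driven pattern being described – generic Python rather than any particular vendor stack – in which events stream to a downstream risk consumer as they arrive, instead of in end-of-day batches:

```python
# Toy event-driven pipeline: a producer streams trade and market-data events on
# to a queue; a downstream consumer (here, a risk view) reacts to each event as
# it arrives. All names and figures are illustrative only.
import queue
import threading

events = queue.Queue()

def producer():
    for event in ({"type": "trade", "notional": 5_000_000},
                  {"type": "fx_rate", "pair": "EURUSD", "rate": 1.18},
                  {"type": "trade", "notional": 12_000_000}):
        events.put(event)
    events.put(None)                       # sentinel: stream closed

def risk_consumer():
    exposure = 0.0
    while (event := events.get()) is not None:
        if event["type"] == "trade":
            exposure += event["notional"]
            print(f"risk view updated in real time: exposure {exposure:,.0f}")

threading.Thread(target=producer).start()
risk_consumer()
```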

Migration to cloud

Cloud technology offers capabilities that can assist with all of these transformations, though it’s no silver bullet. One model recommended by Vandendriessche has a treasury retain ‘mission-critical’ functions in-house and migrate so-called adjacent capabilities to a cloud, so the bank can delegate non-core, yet demanding, tasks offsite and focus internal resources on those responsibilities key to the department’s success. 

“It is very important to have in mind that the migration to cloud is not an obligation, but an option,” explained Vandendriessche.

Investing in new capabilities and innovating workflows, however, could be a tough sell against the backdrop of a grinding recession and elevated concerns about cyber security and operational risk. Cost budgets are predicted to tighten considerably, with one recent survey conducted by the International Monetary Fund revealing that 95% of banks in advanced economies believe their return on equity will be below 10% by 2025, and 20% saying it will be negative.

Identifying specific use cases for technology upgrades could be one way to get managers to open their wallets. “Bank treasurers wake up to the value of innovation when they see a use case that really delivered something tangible for them. For instance, if they want to improve their foreign exchange exposure reporting accuracy or their cash reporting accuracy, and they see it is possible to have a solution in six weeks rather than in some distant future, they will get very excited about [that],” said Nick Wood, managing director of FinTorque. 

What should also appeal to senior management is that investment in a modern, robust treasury infrastructure today should prevent costs racking up tomorrow from handling security incidents and operational failures.

“Very often I see a huge investment into security tools for detection, for incident handling, but lesser investment into actually making sure that, when new services are built, security is built into the architecture of the platform to really limit the exposure [in the first place],” said Caldeira.

Treasurers should, therefore, consider technologies that are built to be robust – rather than those with protections grafted on afterwards. Platforms that are tough, resilient and flexible are needed to power the evolution of the treasury function and, when it comes to something so essential to a bank’s performance, managers may well ask: why compromise?

Mega-hedges and generational strife at PGGM

By Duncan Wood | Feature | 18 August 2020

Buy-side risk survey: for Dutch pension giant, battle between young and old shaped response to March mayhem

This is the tenth in a series of articles connected to our buy-side risk survey.

Arjen Pasma is 46 years old. In the Netherlands, where Pasma lives, it’s an age with a special significance – the tipping point when an individual’s slowly growing pension entitlement starts to outweigh the flat premium paid throughout that worker’s career.

It means Pasma – the chief risk officer of €250 billion ($296.3 billion) pension giant PGGM – has a personal stake in the long-running debate about how to reform the Dutch pension system and its huge, collectively owned pots of assets.

“Because younger people can take on more risk than older people, when you’re young you’re basically paying too much premium for the pension drawing rights you’ve built up,” says Pasma.

“The tipping point is around 45 or 46, so I’ve effectively paid too much premium for my whole career – but if you reset the system right now, where is all of that wealth going? It may look like it’s going from one generation to another. That wealth transfer needs to be corrected: everyone needs to get the pension they paid for.”

For more than a decade, that goal has consumed – and divided – the Netherlands, where as much as 90% of all pension assets are invested in the public system. Pasma and others are hopeful a proposal to create individual pension accounts has enough support to make it into law.

The problem with the current defined benefit system is that the benefits are not defined enough. As the official discount rate – the “rekenrente” – has slumped towards zero over the past decade, Dutch pension funds have seen the present value of their liabilities balloon and their coverage ratios slip – at the fund belonging to PGGM’s largest client, the ratio has dropped to 85.9% from 96.5% in 2019. This crimps payouts to today’s pensioners.

A tempting solution is to hike the discount rate – a position energetically championed by 50Plus, a political party launched in 2009 specifically to advocate for current pensioners. That would result in higher payouts for them, but would also drain the pot available to younger workers.

The result is what Pasma calls “intergenerational conflict” – a struggle between young and old that threatens the ‘we’re-all-in-this-together’ ethos of the public pension system.

“If you assume a rate of 4%, then the coverage ratios of most pension funds would increase substantially – the average duration of the liabilities is around 20 years. You could index returns to inflation and even start topping up past payments,” says Pasma. “But then you start to pay out a huge amount of wealth to a baby boom generation that is already quite wealthy, and the payouts would come from what is currently the buffer for the young generation. That is where you literally start to see intergenerational solidarity tested.”
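A back-of-the-envelope illustration of that mechanism – hypothetical figures, not PGGM’s balance sheet – shows how strongly a liability book of roughly 20 years’ duration responds to the choice of discount rate:

```python
# Illustrative only: coverage ratio = assets / PV(liabilities). With long-dated
# liabilities (modelled here as a single cashflow 20 years out), raising the
# discount rate from near zero to 4% shrinks the PV and inflates the ratio.
def coverage_ratio(assets, liability_cashflow, years, discount_rate):
    pv_liabilities = liability_cashflow / (1 + discount_rate) ** years
    return assets / pv_liabilities

assets, cashflow, years = 84.0, 100.0, 20   # chosen so the low-rate ratio sits in the mid-80s

for rate in (0.001, 0.04):
    print(f"discount rate {rate:.1%}: coverage ratio {coverage_ratio(assets, cashflow, years, rate):.0%}")
# ~86% at a near-zero rekenrente versus well above 100% at 4% - which is why a
# higher discount rate is so tempting for today's pensioners.
```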

The solution currently making its way through the Dutch legislative process would, for the first time, create individual accounts for pensioners. It would complement these with a buffer account, which would be plumped up primarily by siphoning off some returns: “It’s basically to compensate those generations that are relatively unlucky; the generations that are relatively lucky will pass on some of their returns,” says Pasma.

These reforms – slated to take effect in 2026, with a 10-year transition period to follow – are a huge deal for the Dutch pension industry, and for millions of existing investors. As PGGM’s 2019 annual report puts it: “In the coming 10 years, more is likely to change in the Dutch pension landscape than in the past 40 years.”

Tanker in a storm

This shifting backdrop is crucial to understanding PGGM’s attitude to risk. For example, intense scrutiny from regulators, and a well-informed public, influenced some of the firm’s behaviour in March, when the spreading coronavirus triggered mayhem across the financial markets.

On the one hand, PGGM is generally able to look past these gyrations.

“We don’t try to do all kinds of short-term trades, it’s not our policy. We just have to make sure we navigate this tanker through the storm without suffering too much damage,” says Pasma.

It hasn’t been a terrific year returns-wise, but we’ve done reasonably well so far

Arjen Pasma

On the other, it wasn’t at all clear when or how March’s storm would end, which prompted some discussion about the tanker’s course. PGGM generally reviews and rebalances its books on at least a monthly basis, often topping up asset classes that have sold off and now look cheap, Pasma says. During the Covid-19 volatility, this strategy was questioned by the firm’s board. 

“It’s a classic behavioural finance issue – your risk appetite is reference-point-dependent, right? So, when you have a coverage ratio of 99%, which was the case at the end of December, it was fine to buy equities in a falling market. But when interest rates also decline a lot, leaving us two months later at a coverage ratio of 84%, there were some board members who had concerns about adding more risk – it was the right conversation to have,” he says.

Ultimately, the firm left its rebalancing process untouched. The decision paid off, as stock markets rapidly clawed back their March drawdown – helping offset losses in what Pasma describes as “quite a big” commodities book.

Around 30% of the firm’s assets are tied up in private markets. These are large infrastructure, insurance-linked, and risk-sharing transactions with an average deal size of around €200 million – as well as private equity deals that range up to roughly €100 million. By their nature, these are not positions that can be quickly traded if markets turn south, but Pasma says the early indications of asset price declines are not as bad as feared. In commercial property, PGGM had less exposure to shopping malls and offices – which have been hit harder by the lockdown – than to logistics-focused facilities, which have proved more resilient.

In all, he says: “It hasn’t been a terrific year returns-wise, but we’ve done reasonably well so far.”

“Assume nothing”

March posed other threats to the PGGM tanker. In the Netherlands, lockdown was more selective than in many other European countries – schools were quickly shut down, but many businesses remained open, essentially betting that distancing and hygiene measures would be enough to control the spread of the virus.

One of the first actions PGGM took was to obtain dispensation from the government so that if the lockdown was extended, the firm would still be allowed to keep a skeleton crew at its Zeist headquarters, not far from Utrecht.

“We wanted those people to have a stable connection and a Bloomberg machine, rather than working from a laptop at home,” says Pasma.

So far, that dispensation has not been needed. Although the Netherlands, like many other European countries, has recently tightened up its restrictions, it never applied a nationwide stay-at-home rule. As a result, roughly 80 of the 450 staff in PGGM’s investments division – which includes Pasma’s team of 45 – were in the office at any time during the first months of the crisis, rotating between home and remote working.

Having the bulk of staff working from home did require the now-familiar change of working practices and patterns, of course – ensuring remote staff had the hardware, software and connectivity to do their jobs without interruption, and convening regular meetings to help co-ordinate responses to the global panic. PGGM’s freshly formed financial crisis team started meeting in early March, just before the first headline-grabbing falls in oil and stock markets.

There was also a change of emphasis, with Pasma pressing his team not to take anything for granted.

“We made sure people were following protocol, following processes. You cannot assume anything about what your colleague might or might not have checked. You can’t see it, you can’t look over your shoulder and call ‘Have you checked this?’ and get a ‘Yes, I have’ nod in reply. So, assume nothing. And that worked really well. We had no operational incidents,” he says.

The focus on operations and operational risk extended to third parties, of course – where there was at least one slip-up involving a third-party manager of some PGGM assets.

“It was a typical operational incident in which a rebalancing in the portfolio was not executed the way it was supposed to, due to communication issues. Because of the highly volatile markets, correcting such a mistake can be costly. We realise these errors occur, but it was a wake-up call for us that such an error could also have happened on our side,” says Pasma.

Alongside operational exposures, PGGM also kept a closer eye on liquidity risk during March.

Because PGGM hedges the impact falling rates have on its coverage ratio, the firm started receiving huge amounts of collateral – up to €1 billion in a day, Pasma estimates – while simultaneously paying out to counterparties on its nosediving commodity positions. The firm’s trading and treasury teams were “extremely busy” checking the margin totals and chasing the incoming assets, making sure the payments it was owed actually arrived.

In another liquidity flap – after comparing notes with other pension managers – the firm withdrew all of its assets from one of the third-party funds in which it parks liquidity.

“If these funds put all of their assets in overnight cash, then you’re losing money at the moment. So what they do is hold some in tradeable securities. We were paying very close attention to the liquidity profile of these funds – and we did end up replacing one particular fund with another because we were not convinced that it was sufficiently liquid. And because we were seeing these huge swings, particularly in our quality book and our rates book, our risk tolerance for liquidity was basically zero,” says Pasma.

The mother of all hedges

Away from the pandemic, life goes on. PGGM’s huge interest rate derivatives book is an oil tanker in its own right, and currently requires some deft handling. Pasma won’t say precisely how big the book is, but the DV01 – its sensitivity to a one-basis point move in rates – is “north of €100 million”.

He also puts it more simply: “It’s humongous.”
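As a simple worked example of what that figure implies (using only the DV01 quoted above):

$$
\Delta V \;\approx\; \mathrm{DV01} \times \Delta r \;=\; \text{€}100\ \text{million per bp} \times 10\ \text{bp} \;\approx\; \text{€}1\ \text{billion},
$$

so a parallel move of just 10 basis points in rates swings the value of the book by around €1 billion – consistent with the collateral flows of up to €1 billion a day described earlier.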

It faces three big changes – first, global reform of the Libor family of interest rate benchmarks, and other benchmarks based on interbank offered rates, which are referenced in the vast majority of outstanding swaps and swaptions. Second, the ongoing push to centrally clear more of these over-the-counter trades. And finally, the new Dutch pension system and its consequences for the official discount rate.

The probable death of Libor, which could happen as early as the end of 2021 – and could be confirmed later this year – will not just affect the portfolio directly, but also every process and system that currently references a Libor rate. Over the past year, PGGM has been conducting what Pasma calls an “x-ray” to identify all of these little, hidden Libor-dependencies.

Some turned up in unexpected places – for example, in a formula used by PGGM’s human resources department to calculate deferred employee bonuses. Others were where you would expect. The principal challenge was that there were lots of them.

“It’s your entire fixed income operation and the back office where you have to value them; actuarial models, where you have to calculate liabilities and the present value of liabilities; all your valuation models where you perform discounted cashflow modelling; your risk proxies, because you’re using some model with a risk-free rate component; your quant and factor strategies that might have an interest rate component; scenario modelling, where some of your scenarios incorporate interest rate shocks. And so on,” says Pasma.

In all, he says, PGGM has more than 100 systems with some form of Libor or Ibor reference. In most cases, the fix is straightforward – a link to the outgoing Libor or Euribor rate will be replaced by its successor benchmark and the system will continue working almost seamlessly. That is not true everywhere, and Pasma says the firm has “spent a considerable amount of time” working out which models would break when the old reference rate disappears.
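
The mechanical side of that fix can be as simple as a lookup from each outgoing benchmark to its successor. The sketch below illustrates the idea only; the identifiers are hypothetical, and the spread adjustments are placeholders rather than the values that will actually apply when fallbacks are triggered.

```python
# Illustrative benchmark-remapping sketch. Identifiers and spread adjustments are
# placeholders; actual fallback spreads are only fixed once cessation is announced.

FALLBACKS = {
    # outgoing reference   (successor rate, spread adjustment in basis points)
    "USD-LIBOR-3M": ("SOFR-3M-COMPOUND", 25.0),
    "GBP-LIBOR-6M": ("SONIA-6M-COMPOUND", 30.0),
    "EUR-EONIA":    ("ESTR", 8.5),
}

def remap_reference(rate_name: str) -> str:
    """Return the successor reference for a system field that points at an outgoing rate."""
    if rate_name not in FALLBACKS:
        return rate_name  # not an affected benchmark - nothing to do
    successor, spread_bp = FALLBACKS[rate_name]
    return f"{successor} + {spread_bp}bp" if spread_bp else successor

for field in ["USD-LIBOR-3M", "EUR-EONIA", "NL-CPI"]:
    print(field, "->", remap_reference(field))
```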

Something we have to learn is that people had been warning about the risk of a pandemic for many years – and we chose, collectively, to ignore it. That makes you think about other signals we have chosen to ignore

Arjen Pasma

On the clearing front, PGGM and other European pension funds currently have an exemption from the European Market Infrastructure Regulation, which requires users of many interest rate swaps to send their trades to a central counterparty (CCP). Politicians sought to protect the sector after hearing concerns that keeping a stock of liquid assets on hand to meet daily margin calls would erode pension returns – or could, in the case of severe stress, force funds to try and liquidate some investments. 

The exemption is due to expire in mid-2021. PGGM has begun clearing some trades – pushed that way by market forces rather than regulation – but the old complaints haven’t gone away.

“Even if you have an exemption, when fewer and fewer banks want to trade with you OTC but they do want to trade cleared, then you don’t really have a choice,” says Pasma.

He adds: “For us, it’s an increased liquidity risk. One of the strengths of pension funds is that they can be relatively illiquid and they are long-term investors, but if you have this huge derivatives book – and you have to clear it – then it makes you a short-term investor, because you have to keep so much in cash to meet variation margin calls.”

But perhaps the incoming changes to the Dutch pension system will enable PGGM to run a smaller derivatives book. Because the system is dropping the pretence of guaranteed returns, it is also expected to drop the rekenrente – the single discount rate.

“It will, most likely, be based on forward-looking returns on a portfolio level – instead of a single risk-free curve, but the details have to be figured out,” says Pasma.

Dutch pensions specialist Cardano has been trying to figure out some of those details. Roel Mehlkopf – a pension fund adviser with the firm in Rotterdam – says the expectation is that there will be less demand for ultra-long-dated interest rate hedges, while inflation hedging could increase.

“The reason is that in the new system, you can determine interest rate hedges separately for each age group. At the moment, you have one big funding ratio and you might hedge 50% of your exposure to it. In the new pensions contract, you can have a higher hedge ratio for the elderly – to give them a stable outcome – but a lower one for younger generations if you believe hedging is not in their interest. If that’s what happens, then hedging will be more focused on shorter horizons – 10, 20 and 30 years,” he says.
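
A back-of-the-envelope sketch of the effect Mehlkopf describes follows; the cohorts, liability sensitivities and hedge ratios are all invented for illustration and are not Cardano's or PGGM's figures.

```python
# Hypothetical illustration of cohort-specific rate hedging under the new Dutch contract.

cohorts = [
    # (age group, liability DV01 in EUR per bp, rough liability horizon in years, hedge ratio)
    ("25-44", 200_000, 40, 0.20),  # younger members: lower hedge ratio
    ("45-64", 300_000, 25, 0.50),
    ("65+",   250_000, 10, 0.90),  # retirees: higher hedge ratio for a stable outcome
]

single_ratio = 0.50  # old-style single hedge ratio on one big funding ratio, for comparison

for age, dv01, horizon, ratio in cohorts:
    print(f"{age}: hedge {ratio * dv01:,.0f} of {dv01:,.0f} EUR/bp around the {horizon}y point "
          f"(vs {single_ratio * dv01:,.0f} under a single 50% ratio)")
```

Under cohort-specific ratios, more of the hedge sits at the 10- and 25-year points and less at 40 years – the shift towards shorter horizons Mehlkopf expects.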

Neglected risks

Will the pandemic also have a lasting impact? Pasma believes so. 

“Something we have to learn is that people had been warning about the risk of a pandemic for many years – and we chose, collectively, to ignore it. That makes you think about other signals we have chosen to ignore,” he says.

Climate change fits the bill – a risk Pasma believes is being underestimated, rather than ignored – but he has other neglected exposures in mind, too.

I very much believe that if you are a long-term investor, you should really know and understand what risk factors are driving your portfolio

Arjen Pasma

“I’m thinking about technology and robotisation and the huge number of jobs that will be replaced. What will happen there? And other mega-trends, such as mass migration, big demographic changes, and other negative impacts of technology change. Ultimately, these things will result in certain industries and companies being very successful, and others being unsuccessful. And we are investors in all of these companies today, so we need to know about these risks,” he says.

Perhaps this is the trade-off for a pension fund CRO. When markets go into freefall, they have a slightly easier ride than a hedge fund risk manager – they won’t be overseeing frantic attempts to hedge a failing strategy, rapidly liquidate a portfolio, or placate jumpy investors. Their performance over a week, a month, or even a quarter is not going to make or break them. But as that horizon extends, the risks that need to be measured become far more awkward – a greater degree of uncertainty creeps in.

Pasma seems to have made his peace with this: “I very much believe that if you are a long-term investor, you should really know and understand what risk factors are driving your portfolio. Good risk management is not only about quantification, having flashy models. It’s also about common sense, sitting together with your investment strategists and especially – most importantly – having an open mind about the stories that until now we’ve chosen to ignore.”

Basel sets out its stall on operational resilience

By Steve Marlin | News | 13 August 2020

Body says banks expected to maintain critical services, but steers clear of setting compliance metrics

The Basel Committee on Banking Supervision has begun consulting on a guiding set of principles on operational resilience for financial firms – teeing up a period of rulemaking designed to ensure that banks can continue to maintain critical services through pandemics, large-scale system outages or natural disasters.

The principles are vague on how compliance with such expectations will be measured, however – an acknowledgment that different countries will likely take differing approaches to enforcing the rules, such as setting timed targets by which a firm must return to operational functionality following an outage.

“Regulators have been careful at not pushing out too much national guidance. Many held off until this came out. There clearly is an effort to co-ordinate between international regulatory agencies,” says Evan Sekeris, head of model validation at PNC Financial Group, and a former US supervisor.

In some Group of 20 countries, the rulemaking process is well under way. The UK Prudential Regulation Authority (PRA) last year issued a consultation on operational resilience, with the deadline for comments extended to October 1 this year because of the Covid-19 pandemic. The watchdog said firms would not need to meet any new requirements resulting from the consultation, which will apply to all UK banks and larger insurers, before the end of 2021.

Operational resilience is also mentioned in a set of rules governing third-party risk issued last year by the European Banking Authority.

The Basel paper establishes that resilience is a multidisciplinary effort, involving operational risk, business continuity planning, IT, vendor risk and governance. It asks banks to map the interconnections between resources necessary to deliver critical operations.

The Covid pandemic has highlighted the importance of resilience as companies adapt to staff and customers working from home, deal with increased risk of internal and external fraud, and rethink their need for backup facilities. Banks have surprised observers with their ability to maintain services during the crisis: online banking and call centres have stayed open, and markets and payment systems have functioned.

By creating two separate documents, [the Basel Committee is] enforcing the idea that resilience is not merely an enhancement of op risk management

Evan Sekeris, PNC Financial Group

“Ironically, these are the very developments that are causing the Basel Committee increased concern,” notes the head of operational risk at an international bank. “Although this technology has made banking more resilient during a pandemic, it’s created concentrations of data and single points of failure.”

The Basel Committee did not specify what metrics should be used for measuring resilience, instead noting that work on metrics is at an early stage and requesting feedback from banks on the metrics they use. The UK has already ploughed its own furrow, however, asking banks to set ‘impact tolerances’, including time limits on the return to operations following a disruption or outage to a critical business service. Companies will be expected to conduct self-assessments of their operational resilience, and communicate the results to the regulator.
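
As a minimal sketch of what such a self-assessment might boil down to, the snippet below compares assumed recovery times in a severe-but-plausible scenario against assumed impact tolerances; the services and hour limits are hypothetical, not taken from the PRA or Basel papers.

```python
# Hypothetical impact-tolerance check: flag critical services whose recovery time in a
# severe-but-plausible scenario exceeds the time limit the firm has set for them.

tolerances_hours = {"retail payments": 2, "online banking": 8, "trade settlement": 24}
scenario_recovery_hours = {"retail payments": 5, "online banking": 6, "trade settlement": 30}

for service, limit in tolerances_hours.items():
    recovery = scenario_recovery_hours[service]
    status = "BREACH" if recovery > limit else "within tolerance"
    print(f"{service}: recovery {recovery}h vs tolerance {limit}h -> {status}")
```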

Arthur Lindo, chair of the Basel Committee’s operational resilience working group and a deputy director at the US Federal Reserve, told a Risk conference last year that the Fed was not inclined to be prescriptive when it comes to setting impact tolerances. The Fed declined to make Lindo available to comment.

“Many people see metrics as important, but others say time to recovery is important for business continuity – but when you’re talking about resilience, your metrics have to change,” says Sekeris.

The Bank of England’s approach was, in part, an explicit response to instances of high-profile tech failures at UK banks TSB and RBS, which left millions of customers unable to access bank accounts and other services – in some cases, for more than a week.

Basel’s approach follows the UK in expecting firms to construct scenarios to test their ability to maintain critical business services in the event of a severe but plausible disruption of operations. The UK paper is somewhat more detailed in specifying how they should go about this, however. For example, plausibility could be determined by modelling incidents or near misses that have occurred, and firms would be permitted to remain outside their impact tolerances for extreme situations such as a failure of essential infrastructure.

Banks have been actively revising their operational risk scenarios since the pandemic struck, incorporating pandemic risk into scenarios for other operational risks, such as rogue trading, mis-selling, fraud and physical attacks.

Some view the Basel Committee’s less prescriptive approach as a recognition of the progress that banks have made in resilience planning.

“We already have a scenario process. We have a business continuity process, and a lot of organisations have made a lot of how they dealt with Covid based on business continuity. The committee recognises that banks have well-established risk management processes,” says Andrew Sheen, a risk management consultant.

Concurrent with the operational resilience principles, the Basel Committee also issued updated principles for the sound management of operational risk (PSMOR), which had last been revised in 2011. The most significant change is the explicit extension of existing best-practice expectations – such as the classic three lines of defence for op risk – to other facets of a bank, such as its IT infrastructure.

By issuing two separate papers, the Basel Committee is sending a message that operational resilience and operational risk are separate concepts, argues Sekeris. The former is concerned with keeping the lights on, while the latter is concerned primarily with measuring financial and non-financial impacts.

“If you look at the op resilience document, they highlight that op risk management is one of several elements, along with third-party risk, business continuity and governance. They’re sending a message that op resilience is a cross-disciplinary effort. By creating two separate documents, they are enforcing the idea that resilience is not merely an enhancement of op risk management,” he says.

Basel’s work on updating the PSMOR had sat dormant after being shelved in acrimony, multiple sources say, following the transatlantic split that emerged between US and European regulators over the Basel Committee’s decision to axe op risk capital modelling.

The comment period on both consultations concludes on Friday, November 6.

Goldman’s op RWAs climb $13bn on 1MDB settlement

By Louie Woodall | Data | 12 August 2020

Operational risk-weighted assets (RWAs) surged 12% to $132.5 billion at Goldman Sachs over the three months to end-June, anticipating the supersized settlement reached with the Malaysian government in July over the 1Malaysia Development Berhad (1MDB) fraud.

It is the highest level Goldman’s op RWAs have reached on record, and the biggest one-quarter increase since Q4 2015. The jump accounted for nearly two-thirds of the overall increase in advanced approaches RWAs at the bank over Q2.

In its quarterly filing, Goldman said the “vast majority” of the increase was linked to “litigation and regulatory proceedings” – of which the 1MDB case was the most significant outstanding. Total cash set aside to cover these proceedings amounted to almost $3 billion in Q2. 

 

On July 24, Goldman agreed to pay $2.5 billion to resolve the 1MDB proceedings and to return at least $1.4 billion in gains made from assets linked to 1MDB.

Among other US global systemically important banks (G-Sibs), only Wells Fargo also saw its op RWAs increase quarter-on-quarter, by 1% to $340.2 billion.

State Street saw its op RWAs fall the most percentage-wise over the three months to end-June, by 5% to $44.2 billion.

 

What is it?

Basel II rules lay out three methods by which banks can calculate their capital requirements for operational risk: the basic indicator approach; the standardised approach; and the advanced measurement approach (AMA). The first two use bank data inputs and regulator-set formulas to generate the required capital, while the AMA allows banks to use their own models to produce the outputs.
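
As an illustration of the simplest of the three, the sketch below applies the basic indicator approach, which sets the charge at a fixed alpha of 15% of average annual gross income over the preceding three years, counting only years in which gross income was positive; the income figures used here are invented.

```python
# Basic indicator approach sketch: capital = 15% of average positive annual gross income.

ALPHA = 0.15

def bia_capital(gross_income_by_year):
    """Average gross income over the past three years, excluding non-positive years."""
    positive = [gi for gi in gross_income_by_year if gi > 0]
    return ALPHA * sum(positive) / len(positive) if positive else 0.0

# Invented gross income figures for three years, in dollars.
print(f"{bia_capital([40e9, 35e9, 45e9]) / 1e9:.1f}bn of op risk capital")  # 6.0bn
```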

Why it matters

Op risk rules mean Goldman was hit with a one-two punch on settling the 1MDB case. Not only did it have to pay out a huge sum of cash to settle the proceedings, but the bank also had to include the loss in its AMA model, which produced record-high op RWAs, equivalent to a regulatory capital charge of some $10.6 billion.
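
That equivalence follows from the standard 8% mapping between risk-weighted assets and minimum capital; the one-line check below reproduces the figure from the reported RWA number.

```python
# Rough check of the capital charge implied by Goldman's reported Q2 op RWAs.
op_rwa = 132.5e9                 # reported operational RWAs
capital_charge = 0.08 * op_rwa   # 8% minimum capital per unit of RWA
print(f"${capital_charge / 1e9:.1f}bn")  # ~$10.6bn, matching the figure above
```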

However, the regulatory capital burden of op risk may no longer preoccupy Goldman, or any US G-Sib, for that matter. Why? Because of the introduction of the stress capital buffer (SCB), the Federal Reserve’s new methodology for setting minimum capital requirements. The SCB is calculated from banks’ standardised RWAs only. Since the standardised approach does not include op risk, Goldman and its peer banks are technically not bound by their op RWAs if their capital requirement is higher under the SCB than under the old advanced approaches capital conservation buffer.

Still, things will change once the finalised Basel III package is fully implemented, and a souped-up op risk methodology is included in the revised standardised approach. This means the capital hangover of the 1MDB scandal could linger for many years. 


Op risk data: Goldman 1MDB settlement swells 2020 loss tally

By ORX News | Opinion | 11 August 2020

Also: Deutsche fined over Epstein KYC failings; collateral fraud in focus. Data by ORX News


July’s largest loss saw Goldman Sachs announce that it had reached an agreement with the Malaysian government to settle criminal proceedings relating to the bank’s involvement in the 1Malaysia Development Berhad fraud. Goldman Sachs agreed to pay $2.5 billion to resolve the proceedings – the government had been seeking as much as $7.5 billion – and agreed to return at least $1.4 billion in proceeds from assets linked to 1MDB.

Goldman’s fine is by far the largest op risk loss recorded by ORX News during 2020 to date, on its own dwarfing all losses reported between March and June (see figure 1). In part, this reflects an abnormally quiet period for reported losses during Covid-enforced lockdown, when regulatory penalties and settlement activity slowed to a crawl, and no large fines were meted out.

1MDB was a sovereign wealth fund created in 2009 by the Malaysian government, which lost up to $5.7 billion as the result of a $4.5 billion fraud carried out by its executives between 2009 and 2015. According to the US Securities and Exchange Commission (SEC), former Goldman Sachs executive Tim Leissner helped the fund raise $6.5 billion in bond offerings.

Through 2012 and 2013, Leissner and other senior executives at Goldman Sachs worked to obtain and retain business from 1MDB for the benefit of Goldman Sachs through the promise and payment of bribes and kickbacks to government officials in Malaysia and Abu Dhabi. These payments were financed, in part, by embezzled proceeds from the bond deals.

Over the course of the scheme, the SEC estimates that Leissner and others misappropriated more than $2.7 billion, used as bribes and kickbacks to those government officials. Leissner also misrepresented transactions and assets in Goldman Sachs’s books and personally received more than $43 million in illicit payments for his role in this scheme, the SEC claims.

In return, the government agreed to withdraw criminal charges and not bring further charges against the bank, its subsidiaries and its staff, excluding Leissner and Roger Ng, who were charged by the US Department of Justice.

 

In July’s second-largest loss, Deutsche Bank agreed to pay $150 million over compliance failures relating to its relationship with the late convicted criminal Jeffrey Epstein and with Danske Bank Estonia and FBME Bank.

The New York State Department of Financial Services (NYDFS) found that Deutsche Bank had failed to properly monitor account activity conducted on behalf of Epstein despite the publicly reported information about his criminal misconduct.

Despite the bank marking its relationship with Epstein as “high-risk” and designating him an “honorary politically exposed person”, it did not subject his accounts to the necessary enhanced due diligence. As a result, several individuals who were publicly alleged to have been co-conspirators in Epstein’s crimes received payments from his accounts at the bank without that additional scrutiny.

The NYDFS’s fine also relates to Deutsche Bank’s relationship with Danske Bank Estonia and FBME Bank. The NYDFS found that Deutsche Bank had not acted on red flags concerning these banks’ anti-money laundering measures. Both were deemed high-risk, and Deutsche Bank was warned of the dangers of doing business with them. It nevertheless regularly processed billions of dollars’ worth of payments on their behalf.

Valic Financial Advisors saw July’s third-largest loss, paying $40 million to settle two actions filed by the SEC. The SEC alleged, first, that VFA failed to disclose that its parent company paid for VFA to be promoted to teachers and, second, that VFA failed to disclose conflicts of interest regarding the financial benefits it received as a result of its choice of mutual fund investments.

As a result of this conduct, VFA received over $13.2 million in financial benefits. As of July 28, 2020, VFA had repaid clients approximately $2.3 million in fees, plus interest.

The first order required VFA to agree to a censure, refrain from future violations and pay a civil money penalty of $20 million. The second required VFA to pay $20 million in disgorgement, pre-judgement interest and a civil money penalty.

 

July’s fourth-largest loss occurred at Credit Suisse, which reportedly agreed to settle a class action brought by investors in four pension funds for $15.5 million.

The dispute concerned two writedowns Credit Suisse made in early 2016 totalling $633 million on a series of collateralised loan obligations and distressed debt products. As a result of the writedowns, the value of Credit Suisse American depositary receipts (ADRs) fell by 11%, wiping out approximately $23 million in the bank’s market capitalisation. This led to significant losses on investors’ holdings of Credit Suisse ADRs.

The pension funds alleged Credit Suisse had misled its shareholders prior to the writedowns, by stating in filings that it maintained “comprehensive risk management processes and sophisticated control systems governing its investment operations”. The complainants alleged that on at least three occasions, Credit Suisse had “redefined”, “increased” or “retired and replaced” its risk limits to accommodate its growing risk exposure.

As of July 2020, the settlement required the approval of a federal court in Manhattan.

In July’s fifth-largest loss, UBS paid out $10 million over allegations it had circumvented the priority given to retail investors in certain municipal bond offerings between August 2012 and June 2016.

According to the SEC, UBS representatives improperly allocated bonds on multiple orders intended for retail customers to parties known in the industry as ‘flippers’, who then immediately resold or ‘flipped’ the bonds to other broker-dealers at a profit. Executives often attached false zip codes – not associated with the relevant accounts – to the retail orders in order to meet the issuers’ definition of retail priority.

Through this scheme, UBS made a total profit of $5.2 million by circumventing retail order priority and gaining improper access to a higher priority in the bond allocation process.

The SEC’s order imposed a $1.75 million civil penalty, $6.8 million in disgorgement of ill-gotten gains and $1.5 million in pre-judgement interest.

 

Spotlight: Aussie banks pay for underpaying staff

Over the course of 2019 and 2020, two of Australia’s big four banks, Westpac and Commonwealth Bank of Australia (CBA), began the process of repaying thousands of staff who had been underpaid as a result of systems errors at both institutions.

Westpac announced in July 2020 that it would pay back A$8 million (US$5.7 million) to around 8,000 employees. Westpac said that it did not apply the correct methodology for determining long-service leave entitlements where staff had changed their working arrangements, such as moving from part-time to full-time. Westpac also said that, for long-service leave entitlements, different rules applied to different employees based on their employment history and working arrangements. As a result of the error, some Westpac staff were overpaid, but these employees would not be asked to repay any money.

CBA announced in April 2019 that it expected to pay A$15 million to current and former employees. CBA reportedly underpaid its staff and that of its Bankwest subsidiary due to errors in its systems, including payroll and other human resources systems. This also resulted in problems calculating leave, superannuation and redundancy entitlements. Some of the problems reportedly dated back 10 years.

In December 2019, it was reported that CBA had widened the investigation into the issues, examining the records of 250,000 current and former staff, going back as far as 2002. The average reimbursement was reportedly about A$220 per employee. Due to the wider investigation, CBA was reportedly expected to pay A$53.1 million in repayments to employees.

In focus: Collateral mismanagement

Over the last few years, risk managers have increasingly placed risks such as cyber, information security and third-party risk at the top of their rankings of current and emerging risks. But even as new threats emerge, traditional operational risks remain – and many reared their heads again during the Covid-enforced lockdown. Perhaps the costliest was trade and commodity finance fraud based on fraudulently documented or fictitious collateral.

Perhaps the most high-profile example was the $3.8 billion default of Singaporean oil trader Hin Leong, on the back of plummeting oil prices in April. According to press reports, debts running to billions of dollars to a group of lenders including ABN Amro, HSBC and Societe Generale were backed by assets that turned out to be either pledged multiple times over or non-existent, making it unclear how much lenders will be able to recoup.

There are plenty of similar stories among Asia-Pacific lenders in the commodities sector. For example, in June 2020, 14 Chinese financial institutions were reported to have lost a total of 18.4 billion yuan ($2.6 billion) after a gold processor, Wuhan-based Kingold Jewelry, used fake gold bars as collateral on the loans.

But the problem runs much deeper. Since 2012, ORX News has recorded 120 stories in its database tagged to the collateral management process. These mostly involve lenders losing millions of dollars on loans backed by fraudulent or overstated collateral.

Stories based around collateral management and fraud tend to have a low frequency but a high severity, with total losses often in the hundreds of millions or even billions. Often, these events cause losses at several firms.

In the example of Kingold, it was not until after the company had begun defaulting on its loans that the lenders tested the gold bars, and found that they were made of a gilded copper alloy.

In another example, Natixis was defrauded of $31.8 million in 2017 after a borrower pledged fake receipts for nickel stored in warehouses as collateral for a loan; the bank only discovered that the receipts were fraudulent many months after the loan had been issued.

Due diligence is the first step to preventing such losses. Collateral management agreements (CMAs) are becoming an important risk mitigant, especially under Basel III. Under a CMA, a collateral manager takes control of the physical commodity until the trade is completed and the loan is paid off in full.

CMAs alone are not enough, however. Instead, banks and lenders need to better understand the processes involved in collateral management, say op risk practitioners. Lenders and other financial institutions are routinely said to be over-reliant on promissory notes and the like when determining the value of collateral used against loans, rather than on the physical processes of verifying and confirming collateral.

Editing by Tom Osborn

All information included in this report and held in ORX News comes from public sources only. It does not include any information from other services run by ORX, and we have not confirmed any of the information shown with any member of ORX.

While ORX endeavours to provide accurate, complete and up-to-date information, ORX makes no representation as to the accuracy, reliability or completeness of this information.

Covid forces banks to focus on vendor risk

By Steve Marlin | News | 6 August 2020

Firms eyeing financial well-being of critical suppliers, and seeking alternatives

The economic fallout from the Covid-19 pandemic has spurred banks to increase due diligence of vendors that provide critical outsourced services – while at the same time making audits of such firms difficult, and on-site inspections impossible.

Companies are scrutinising everything from the financial well-being of their third-party suppliers to their ability to switch to other providers, should their primary ones fail. This is all the more important in an environment where on-site inspections are non-existent.

“For a lot of third parties, working from home is a foreign concept,” said Nasser Fattah, managing director of cyber security, IT and third-party risk at MUFG, during a Risk.net webinar on August 5. “Third parties [would] have a significant impact if they were to experience a breach, so you need to have a dialogue with critical vendors.”

Amid instances of outsourced service providers buckling under demand during the pandemic, financial firms are closely monitoring the ability of vendors to weather the economic downturn and maintain critical operations. In some cases, banks say they’ve stepped in to support vendors in obtaining government permission to continue operations in-office, with key worker designations.

Banks are also taking a critical look at the controls vendors maintain for managing their data – and at who has access to it – given the threat of external parties or insiders gaining access to sensitive information. The possibility of unauthorised access to networks is also prompting closer scrutiny of vendors’ policies for managing employees working remotely.

“You need to have good working relationships with third parties and minimise ways they access your environment. Insider threats from a third party [are] hard [to manage], so you need awareness and education,” said Jörgen Mellberg, chief information security officer and head of IT at Sparbanken Syd in Sweden, during the webinar.

With on-site inspections at times impossible during lockdown, companies are paying closer attention to information on the financial health of suppliers. Where those suppliers are publicly listed, firms are poring over audited statements to determine their credit standing, sources of liquidity and available capital.

Third parties [would] have a significant impact if they were to experience a breach, so you need to have a dialogue with critical vendors

Nasser Fattah, MUFG

It’s also important to determine the degree to which third parties are dependent on or owned by other suppliers, such as the big tech firms, banks point out.

“[Relationships with] Microsoft and Apple could be 80% of a small [vendor’s business]. Concentration raises the risk level. Go through the financial statements and try to determine if they have concentrations within their own business,” said Joseph Iraci, head of financial risk management at TD Ameritrade, during an earlier webinar on August 3.

Given the possibility of a critical supplier becoming unavailable or insolvent, having a replacement strategy is crucial. Firms are assessing the criticality of the products and services they source externally with a view towards either switching to another provider or bringing services in-house.

Given the difficulty of assessing the finances of privately held vendors, and the inability to carry out on-site inspections, firms need to understand the risks inherent in an incomplete audit and be prepared to apply additional scrutiny once they can get back on-site, operational risk executives say. In some cases, companies are making do with minimal audits – sometimes as simple as a webcam walk-through of a data centre.

Firms should be prepared to help their critical suppliers get over the hump, such as sharing their security awareness programmes to educate third-party personnel on best practices for controlling cyber risk, said MUFG’s Fattah.

“If you have an effective security awareness programme, you should consider extending it to your third party, because their success is your success,” he added.