Top Australian lender Commonwealth Bank saw its operational risk-weighted assets (RWAs) fall by more than A$6.2 billion ($4.8 billion) over the fourth quarter of 2020 after its regulator cut in half a capital add-on imposed as punishment for past conduct failures.
The firm’s op RWAs stood just shy of A$50 billion at end-December, down 11% on three months prior and 16% on a year ago. On November 20, the Australian Prudential Regulation Authority (Apra), the bank’s watchdog, reduced its A$12.5 billion op RWA add-on, first applied in May 2018, by 50% “in response to the bank’s progress in addressing concerns over its governance, accountability and risk culture frameworks and practices”.
The reduction in op RWAs cancelled out a A$5.1 billion increase in credit RWAs over Q4 and a A$661 million hike in interest rate risk in the banking book. Market RWAs edged A$1.4 billion higher over the quarter.
Total RWAs for end-December stood at A$453.6 billion, essentially flat on three months prior and about 1% higher on end-2019. Commonwealth Bank’s Common Equity Tier 1 (CET1) capital ratio ended the year at 12.6% on an Apra basis. Of the 100-basis-point improvement to the ratio disclosed between June and December, Commonwealth Bank said 17bp was attributable to the halving of the capital add-on.
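The 17bp attribution can be sanity-checked from the disclosed figures. A rough back-of-the-envelope calculation (the CET1 capital figure below is inferred from the reported ratio, not disclosed directly):

```python
# Rough check of the disclosed 17bp benefit from halving the op risk add-on.
# Figures are approximate, taken from the article; CET1 capital is inferred.
total_rwa = 453.6    # A$bn, end-December total RWAs
add_on_cut = 6.25    # A$bn, RWAs removed by halving the A$12.5bn add-on
cet1_ratio = 0.126   # reported end-December CET1 ratio (Apra basis)

cet1_capital = cet1_ratio * total_rwa  # implied CET1 capital, A$bn
ratio_without_cut = cet1_capital / (total_rwa + add_on_cut)
benefit_bp = (cet1_ratio - ratio_without_cut) * 10_000
print(f"{benefit_bp:.0f}bp")  # roughly 17bp
```

Restoring the A$6.25 billion of RWAs would leave the ratio at about 12.43%, roughly 17bp below the reported 12.6%, consistent with the bank's disclosure.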
Apra launched an inquiry into the Commonwealth Bank in August 2017, following several incidents that damaged the reputation and public standing of the bank.
The resulting report found that “CBA’s continued financial success dulled the institution’s senses to signals that might have otherwise alerted the board and senior executives to a deterioration in CBA’s risk profile”. It concluded “this dulling was particularly apparent in CBA’s management of non-financial risks; ie, its operational, compliance and conduct risks” – hence the add-on to its regulatory capital.
Losing half the RWAs associated with the op risk add-on came at an opportune time for Commonwealth Bank. The reprieve effectively offset more than two-thirds of the RWA inflation related to credit, market and interest rate risks over the year, keeping its CET1 ratio from degrading.
Apra said the remaining half of the add-on will stay in place until the bank fulfils all the requirements included in the remedial action plan it was handed in 2018. It took close to two years for the regulator to consider the bank half-way done, so it may be some time before that additional A$6.25 billion of RWAs is abolished.
Also: Wells Fargo fee fight; Nasdaq queers Emir reg; Covid keeps AML fines in line. Data by ORX News
Deutsche Bank will pay $123 million to settle a Department of Justice (DoJ) investigation into corrupt payments and bribes – actions that notoriously included cartoon caper-style transfers of cash in manila envelopes – in a co-ordinated resolution with the Securities and Exchange Commission (SEC). The fine represents January’s largest loss.
Between 2009 and at least 2016, the bank falsely concealed bribes that had been paid to clients’ decision-makers to gain and retain business worldwide, according to the DoJ. For payments of $7 million in bribes, the firm made $35 million in profits, it said. The bank concealed the bribes by recording payments to contracted ‘business development consultants’ as referral fees and consultancy payments – and failed to implement and maintain internal accounting controls, which would have detected such behaviour.
Deutsche also “knowingly and wilfully” conspired to conceal these payments through shoddy accounting that was shared with senior management, the SEC found. Although the bank had identified such failures through internal audits in 2009 and 2011, it failed to remediate them until 2016.
The US Postal Inspection Service and the Federal Bureau of Investigation also played a part in uncovering these breaches of the US’s Foreign Corrupt Practices Act (FCPA).
Deutsche entered into a three-year deferred prosecution agreement and agreed to pay $80 million for the FCPA violations and $43 million in criminal disgorgement and victim compensation to the SEC.
The DoJ further announced that Deutsche would pay a separate fine of $7.5 million to settle a commodities fraud case, which found that traders on its precious metals desk in New York, Singapore and London had spoofed the metals markets.
In January’s second-largest loss, Wells Fargo agreed to pay up to $40 million to settle class action claims that it misled small businesses and overcharged them for payment processing. The suit, filed in August 2017, was brought by six small business merchants that had retained Wells Fargo Merchant Services to process credit and debit card payments. WFMS failed to properly disclose the true rates and charges that would apply, they alleged, then breached their contracts by increasing rates and fees. The bank also imposed improper or inadequately disclosed card processing fees and charges, making illicit profits of $200 million a year as a result, the suit alleged.
The bank crammed merchants with unanticipated fees once they were “locked in”, and buried absurd and unfair exculpatory provisions in 50 pages of fine print and legalese that it knew would neither be read in full nor reasonably understood by customers, the plaintiffs said. The final value of the settlement depends on the number of valid claims received.
In the month’s third-largest loss, Nasdaq Clearing was fined $36 million by Swedish regulator Finansinspektionen (FI) over what it found were serious breaches of the European Market Infrastructure Regulation (Emir).
FI began its investigation into Nasdaq Clearing in 2018 after private Norwegian trader Einar Aas, a self-clearing member of the exchange, took a spread position on Nordic and German electricity futures, betting the basis between the two markets would tighten. Instead, it blew out – leaving Aas with huge mark-to-market losses and unable to meet an intraday margin call. He was declared in default on September 11, 2018.
The clearing house failed to find a buyer at an acceptable price for the portfolio at a first, hastily arranged auction held that day. At a second auction on September 12, it was thought the portfolio had been sold – public statements at the time said it had been “closed out”. But subsequent reporting by Risk.net confirmed that Nasdaq had instead merely purchased hedges for the defaulted positions and had retained the portfolio for more than nine months.
FI found Nasdaq to be in violation of the investment prohibition in Emir by investing its own funds in derivative contracts too long after a default event. It also asserted that, because of functional differences in the underlying products used as hedges and cashflow dates not matching Aas’s trades, Nasdaq and its members were exposed to residual risk – in other words, it had failed to meet its primary objective of running a matched book. FI further found obvious and significant deficiencies in how Nasdaq had designed its participant requirements and monitored fulfilment of these requirements.
In the fourth-largest loss of the month, UK-based money transfer business MT Global landed a record fine of £23.8 million ($32 million) from Her Majesty’s Revenue and Customs (HMRC), the UK tax authority, for anti-money laundering and record-keeping failures.
HMRC, whose fine is reportedly the largest it has ever handed out, found the firm had breached a total of seven regulations by failing to carry out risk assessments, to put the correct policies, controls and procedures in place, to conduct due diligence and to keep adequate records.
Capital One suffered January’s fifth-largest loss when it agreed to pay a $13 million settlement to customers, who alleged in a class action they had been unfairly charged fees to make balance inquiries and that Capital One had misrepresented its fee practices.
The suit, launched in April 2018, said the bank had charged customers for balance inquiries at its ATMs and two separate fees for making balance inquiries and withdrawing cash from out-of-network ATMs – on top of the ATMs’ own fees – in breach of its Electronic Funds Transfer Agreement.
In January, the China Banking and Insurance Regulatory Commission (CBIRC) said it had fined four institutions a total of 199 million yuan ($30.8 million) for regulatory violations involving wealth management products (WMPs) and small business loans. It fined China Development Bank (CDB), the Industrial and Commercial Bank of China (ICBC), China Great Wall Asset Management (GWAM) and the Postal Savings Bank of China (PSBC) in the action.
CDB’s $7.7 million fine was for financing illegal government purchases of service projects and illegal collection of loan commitment fees for small and micro enterprises. Its leasing unit was also fined 1 million yuan for transferring non-performing assets off its balance sheet.
ICBC was fined $8.4 million for investing in its own wealth management funds and other banks’ credit and non-standard assets and for inadequate information disclosures of its WMPs. GWAM’s $7.6 million fine was for providing illegal guarantees, inflating profits and overpaying performance bonuses to executives. Two of its subsidiaries were also fined for illegally setting up subsidiaries and mortgages – and for illegally accepting its own equity as collateral to provide financing to shareholders. And PSBC was fined $7 million for providing guarantees on wealth management products sold by some of its branches.
The string of penalties follows the trend for regulators in Asia-Pacific to impose increasingly hefty fines for wrongdoing.
Few could have predicted that Covid-19 would presage a downward trend in regulatory fines, or that a resulting fall in cash usage would stymie money launderers, curtailing op risk losses around the globe.
But global anti-money laundering (AML) fines fell 64% from 2018’s $2.67 billion to only $955 million in 2020. Frequency of AML and know-your-client (KYC) fines also fell sharply over this period – from 43 to just 20, spurred perhaps by shifting regulatory priorities during the pandemic.
And although 2020 saw one more AML fine of more than $1 million than 2019 did, the severity of fines fell by 41% over the year.
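The headline decline can be reproduced from the ORX News totals quoted above:

```python
# Quick check of the quoted AML-fine statistics (ORX News figures as given).
fines_2018 = 2.67e9   # total global AML fines, USD
fines_2020 = 0.955e9

decline = 1 - fines_2020 / fines_2018
print(f"severity down {decline:.0%}")  # ~64%

count_2018, count_2020 = 43, 20        # AML/KYC fine frequency
print(f"frequency down {count_2018 - count_2020} fines")
```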
Surprisingly, perhaps, western Europe issued 2020’s greatest number of AML fines – in both frequency and severity. The largest recorded by ORX News was Skr4 billion ($397 million), levied for AML failures in Swedbank’s Baltic operations. The Swedish regulator, Finansinspektionen (FI), concluded that Swedbank had not efficiently addressed the risk of money laundering in the region, despite several internal and external reports warning of the deficiencies of its Baltic subsidiaries.
FI handed out another chunky AML fine of Skr1 billion ($107 million), this time for failures in SEB’s Baltic operations, while its Estonian subsidiary, SEB Pank, was fined €1 million ($1.19 million) for its own AML failures by Estonian regulator Finantsinspektsioon (FSA). Two years earlier, Danske Bank was ordered to set aside $1.5 billion in capital to cover potential AML violations in its own Baltic operations.
In North America’s highest fine, the New York State Department of Financial Services (DFS) fined Deutsche Bank $150 million over its relationships with Danske Bank Estonia, with FBME Bank and with Jeffrey Epstein, the convicted sex offender. The DFS accused Deutsche of ignoring numerous red flags and both internal and external warnings in conducting these relationships.
With only three AML fines, North America nonetheless had the highest average fine at $89 million, versus the western European average of $73 million. America’s relatively low number of AML fines could reflect the progress of draft legislation through Congress. The Illicit Cash Act is designed to toughen AML laws by forcing US financial institutions to share information on shell companies and to help authorities trace beneficial ownership. It also seeks to push banks to update their AML systems to make greater use of machine learning technology.
Banks have cautiously welcomed tighter legislation, but the implementation of stricter laws brings risks associated with changing controls. And while the act could lay the groundwork for increased scrutiny of AML controls and suspicious transactions, whether it will do so remains to be seen.
Indeed, the first few weeks of 2021 have already witnessed two large AML fines. As reported above, the UK tax authority issued its largest ever fine, while in the US, the Federal Deposit Insurance Corporation (FDIC) fined Apple Bank for Savings $12.5 million for violating the federal Bank Secrecy Act and failing to fully institute a promised AML compliance programme.
With tightening regulations on the agenda, financial institutions can expect their AML and KYC processes to receive greater scrutiny.
Editing by Louise Marshall
All information included in this report and held in ORX News comes from public sources only. It does not include any information from other services run by ORX, and we have not confirmed any of the information shown with any member of ORX.
While ORX endeavours to provide accurate, complete and up-to-date information, ORX makes no representation as to the accuracy, reliability or completeness of this information.
Internal audit is a corporate governance mechanism that is thought to play a particularly important role in goal achievement and the optimal use of an organization’s resources (Alqudah et al 2019; Turetken et al 2019; Narayanaswamy et al 2018; Steinbart et al 2018). For this reason, internal auditors are trained to acquire specialized knowledge in new techniques in order to monitor, and ensure the effectiveness of, internal control and risk management systems (Alias et al 2019; Sy and Tinker 2019; Mihret and Grant 2017; Coetzee 2016).
The importance of internal audit becomes more apparent when we consider the financial scandals uncovered in recent years (Oussii and Klibi 2019; Musallam 2018; Sorensen and Miller 2017; Endaya and Hanefah 2016). These scandals led to an increasing focus on an effective internal audit function (IAF) to ensure the reliability and integrity of financial information by achieving maximum quality in the financial reporting process (Botha and Wilkinson 2019; Hazami-Ammar 2019; Repousis et al 2019; Oussii and Boulila Taktak 2018). In this way, useful financial information is conveyed to investors and business stakeholders, while enterprise risk management (ERM), which covers the multiple risks a firm faces in the business world, is ensured (Alzoubi 2019; Mardessi and Ben Arab 2018).
As more and more risks emerge, risk-based internal audit (RBIA) is increasingly adopted by firms (Shin et al 2013; Castanheira et al 2010; Koutoupis and Tsamis 2009). Operational risk management is now incorporated into the internal audit department of firms and organizations (Coetzee and Lubbe 2014). Internal audit is expected to identify and mitigate the risks that business entities face on a daily basis, while the identification and analysis of the factors influencing the implementation of RBIA is also of great importance (Zainal Abidin 2017; Castanheira et al 2010).
The literature has so far identified certain variables related to the implementation of RBIA, such as the risk management system or the “review concern” that characterizes the audit committee. Previous research has focused primarily on audit committee characteristics and the company’s risk management or internal control system, relying on data obtained from internal auditors and business managers or financial statements (Zainal Abidin 2017; Castanheira et al 2010). However, the risks surrounding audit issues have become more complicated due to technological complexity (Lois et al 2019). This paper investigates additional factors related to internal auditor characteristics, internal audit quality and international internal audit standards, with data derived from internal auditors and managers or business directors as well as external auditors.
The aim of this study is to expand the critical variables used in the implementation of RBIA within Greek companies, as well as to examine the relationship between these variables and RBIA implementation. Our research includes the statistical analysis of questionnaires completed by internal and external auditors, as well as directors, managers and employees of Greek firms, regarding the implementation of RBIA. The variables analyzed in this model are internal auditor characteristics, internal audit quality, the need for review, the risk management system and the degree of compliance with international internal audit standards.
The analysis provides a set of internal audit variables that companies need to focus on in order to handle the plethora of operational risks they face daily, ensuring that their objectives are met and that procedures within the firm are properly executed. At the same time, certain factors emerge that should be taken into account by any management wishing to implement RBIA, in order to achieve a thorough analysis of the company’s data and reports, mitigate risks and ensure financial reporting quality, thus eliminating misrepresentation or falsification of financial statements.
The novelty of the present paper is threefold. First, our results highlight the factors associated with the implementation of RBIA, helping business executives focus on the important factors in the proposed model. Second, the analysis includes additional factors that have not been investigated in similar studies, related to internal auditors, internal audit quality and the implementation of international internal audit standards. Third, the Greek economy, having just emerged from the recent economic crisis, is a particularly interesting and not yet extensively explored area of research for the international literature (Dimitropoulos et al 2019; Zoega 2019; Koutoupis et al 2019; Mertzanis et al 2019; Williams and Vorley 2015). Consequently, the present analysis draws important statistical conclusions about companies and organizations in an economy that has just come out of an economic crisis, as in the case of Greece.
The outline of this paper is as follows. The relevant literature examining the implementation of RBIA and its variables is reviewed in Section 2, in order to better clarify the research hypotheses of the model. The methodology of the research model and the research results are presented in Sections 3 and 4, respectively. The study’s results are discussed in Section 5. Conclusions, a discussion of the limitations of the study, and some suggestions for future research are presented in Section 6.
The main role of internal audit is ERM: ensuring that engagements are performed effectively and efficiently and that the available resources are optimally used. According to Coetzee and Lubbe (2014), there are four key steps to achieving an effective and efficient internal audit process: the determination of strategic objectives; the identification, on the auditors’ part, of potential risks that may threaten the achievement of these objectives; the assessment of risks, including the likelihood of any event as well as its potential impact; and the regular monitoring of risks and their elimination. Risk assessment procedures are now in place in most companies. In Italy, 67% of the companies involved in a 2003 survey had followed risk assessment procedures and applied the Committee of Sponsoring Organizations of the Treadway Commission model, mainly at a macro level, to plan their annual schedule of audits (Allegrini and D’Onza 2003).
The internal audit should focus on the risks faced by companies in terms of compliance with the legal framework and regulations. As far as financial institutions are concerned, the most effective approach for conducting internal audits involves the implementation of an RBIA. The RBIA continuously monitors and evaluates all enterprise risks, focusing on the likelihood of occurrence and the potential outcome. In order to accomplish the audit, audit teams are required to fully comprehend an entity’s objectives and strategies. Next, the teams identify and analyze internal and external control environments in order to assess enterprise risks, and ultimately amend them (Koutoupis and Tsamis 2009). The implementation of a risk management model when conducting an internal audit depends mainly on whether the company is active internationally or is listed on the market, since such entities are exposed to more risks; private sector firms are more likely to implement risk management systems than public sector firms (Castanheira et al 2010).
Regarding the relationship between audit mechanisms and the implementation of RBIA, the audit committee should have an active role in providing an integrated perspective on the audit process. The committee should also monitor the use of risk management systems. The aforementioned relationship and monitoring constitute factors that determine the implementation of RBIA procedures (Zainal Abidin 2017), while the internal audit, its added value and the internal auditors play a significant role in increasing the effectiveness of risk management practices (Drogalas and Siopi 2017).
Internal auditor characteristics are related to internal auditors’ performance and influence the implementation of the internal audit. Internal auditors should acquire both general and specialized audit knowledge of the business environment in which they operate and of the risks they face (Endaya and Hanefah 2016; Sarens et al 2009; Petridis et al 2019). Elements that increase added value and enhance internal auditing include experience in economic and financial matters, the auditors’ involvement in business processes and their perseverance. Similarly, auditors need to cooperate and communicate with other employees and auditors in order to enhance the efficiency and effectiveness of internal auditing. In addition, the involvement of internal auditors in self-assessment and risk management processes is directly related to internal audit performance (Alqudah et al 2019; Arena and Azzone 2009). Finally, internal auditor independence is essential for them to perform all audits efficiently and effectively and to lay the foundations for the proper management and resolution of enterprise risks. This ensures the understanding and management of the entity’s enterprise risks as well as the effective contribution of internal auditors to financial decisions.
In view of the above, the following hypothesis is developed.
"(H) There is a positive correlation between internal auditor characteristics and the implementation of an RBIA."
Internal audit quality is a determining element in the effectiveness of an internal audit within a company. The factors impacting it include the expertise of auditors, the scope of services provided and the proper planning, execution and communication of audits performed. A good-quality internal audit also enables auditors to provide useful recommendations and findings, which are of high importance for management decisions and risk management (Mihret and Yismaw 2007). Hanskamp-Sebregts et al (2019) argue that internal audits give insight into quality problems and allow improvements.
There are certain issues we must consider to measure the quality of the internal audit. First, an entity should determine its annual audit plan, and revise it if it is found to be inappropriate. The importance of each department audited and the way internal auditors communicate with other participants, given their relevance to the audit process, should also be investigated to support any improvements (Alzeban and Gwilliam 2014; Cohen and Sayag 2010). Quality is determined by several interconnected factors, including the accomplishment of internal audit objectives, the efficiency of the auditors’ work, the communication between internal and external auditors, the significance of the internal auditors’ recommendations, and the appropriate justification of audit findings. These factors are indirectly related to the way an entity manages risk (Lois et al 2019). In order to achieve quality during internal audits, internal auditors must not only provide financial advice but also be objective and able to adapt to statutory requirements, so as to accomplish the firm’s objectives and contribute to the improvement of its risk management.
The abovementioned studies examine internal audit quality as a determining factor for internal audit effectiveness. However, taking into account the fluctuating business environment and the various risks, as well as the direct link between internal auditing and risk management, the second research hypothesis is developed as follows.
"(H) There is a positive correlation between internal audit quality and the implementation of an RBIA."
The purpose of the audit committee is to provide continuous monitoring of developments related to the audit department in order to keep track of the progress of management’s actions. One significant feature of the audit committee is its responsibility for the review process, that is, the diligent and proactive attitude of auditors in giving opinions and views on the audit process. An audit committee characterized by a responsible attitude to the review process periodically monitors, reviews and approves internal audit plans and related activities, while considering information pertaining to the business risks faced by the firm (Zainal Abidin 2017). The effectiveness of the audit committee is related to its ability to monitor, based on data and information provided by the internal audit department. In turn, this relationship is highlighted by the audit committee’s significant influence on management (Oussii and Boulila Taktak 2018; Turley and Zaman 2007).
However, responsibility for the review also pertains to internal auditors, who must monitor the risks to which the entity is exposed, as they are often asked to explain and defend decisions about risk management, the necessary audits and ways to mitigate risks. Therefore, it is important to have a review and feedback process, ensured by ongoing internal audits and periodic reports. Through this process, internal auditors can secure the correct flow and execution of management actions, and identify shortcomings in actions that should have been taken (Mat Zain and Subramaniam 2007).
The above analysis highlights the fact that the need for internal auditors and the audit committee to review is related to the risk management that each firm must perform.
Therefore, we postulate the following.
"(H) There is a positive correlation between the responsibility for review and the implementation of an RBIA."
Enterprise risk management is directly linked to the achievement of the strategic objectives of any entity or organization, and aims to ensure the existence of appropriate controls and procedures so as to improve decision-making and thereby enhance performance for all stakeholders (Cruz 2006, 2014). It is therefore necessary to have a risk management system covering all the risks that may affect overall company performance (Woods 2007; Goodwin-Stewart and Kent 2006). Failure to implement a risk management system, or only partial implementation, may lead to poor identification of the risks the company needs to control, transferring responsibility to the audit department. This escalates the workload of the internal auditors, since their role now includes risk management, raising the potential for conflicts of interest between auditors and executives. Note that risk management systems tend mostly to be adopted by financial institutions and large enterprises, while ERM is mostly applied where an RBIA is in place and internal audit has a role in establishing risk management or risk analysis (Mardessi and Ben Arab 2018; Drogalas et al 2020).
Based on the above analysis, we introduce the following hypothesis.
"(H) There is a positive correlation between the risk management system and the implementation of an RBIA."
International internal audit standards aim to educate individuals about the role and responsibilities that exist in the administration of internal audit and provide the basis for measuring the performance of internal audit and improving its implementation. However, due to the complexity of the economic environment and the variety of companies in terms of size, purpose and structure, the way an internal audit is performed and the degree of compliance with international internal audit standards varies from enterprise to enterprise. Noncompliance with international internal audit standards can be traced back to a lack of perception of their added value by management. The time needed for auditing, combined with inadequate training of staff, further hinders the implementation of the standards. In addition, some companies consider the audit standards to be complex and unsuitable for small entities, or unrepresentative of certain sectors (Burnaby et al 2009). However, in compliance with international internal audit standards, the formal documentation of a company’s purpose and staff responsibilities, consistent with the International Standards for the Professional Practice of Internal Auditing (ISPPIA), is vital for a higher quality of financial reporting (Alzeban 2019). Finally, internal auditors may find it useful to have formal certifications that demonstrate they have understood the standards and possess the necessary knowledge to perform internal audits (Sadler et al 2008).
In view of the above analysis, and in conjunction with the environment in which companies operate, which involves many business risks that need to be addressed, the following hypothesis is developed.
"(H) There is a positive correlation between the degree of compliance with international internal audit standards and the implementation of an RBIA."
Primary data was collected using a questionnaire distributed to internal auditors, external auditors, directors and managers of Greek companies. A total of 105 responses were ultimately received, resulting in a response rate of 43%. The data collection was carried out in 2019.
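The number of questionnaires distributed is not stated, but it can be inferred approximately from the response count and response rate given above:

```python
# Back out the approximate number of questionnaires distributed.
# The 43% rate is as reported; the implied total is an inference, not a
# figure from the study.
responses = 105
response_rate = 0.43
distributed = responses / response_rate
print(round(distributed))  # roughly 244 questionnaires
```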
The questionnaire was based on a review of the relevant literature. The questions were initially discussed with four knowledgeable internal auditors. The questionnaire presented affirmative statements, and respondents were asked to indicate their level of agreement on a five-point Likert-type scale (Holt 2014).
The questionnaire was divided into seven parts. The first one investigated general information about the respondents and companies. The second one investigated the degree of implementation of RBIA. The other five parts related to five independent variables: the characteristics of internal auditors; the quality of internal audit; the review process; the risk management system; and the compliance with ISPPIA.
The initial analysis was based on the mean scores of all the respondents on the five-point Likert scale. The study then continued with a reliability analysis and a regression analysis in order to test the research hypotheses.
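The paper does not detail its computations, but a reliability (Cronbach's alpha) and regression step of the kind described might be sketched as follows, using numpy on simulated Likert responses. All data, loadings and seeds here are hypothetical, not the study's:

```python
import numpy as np

def cronbach_alpha(items):
    """Scale reliability for an (n_respondents, n_items) matrix of Likert scores."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Simulated 1-5 Likert data: 105 respondents (as in the study), 7 items for
# the dependent construct and 5 for one independent construct, both driven
# by a common latent factor so the regression has something to find.
rng = np.random.default_rng(0)
latent = rng.normal(size=(105, 1))
dep = np.clip(np.rint(3 + latent + 0.5 * rng.normal(size=(105, 7))), 1, 5)
indep = np.clip(np.rint(3 + latent + 0.5 * rng.normal(size=(105, 5))), 1, 5)

alpha = cronbach_alpha(dep)  # internal consistency of the 7-item scale

# Regress construct means (RBIA implementation on one explanatory construct)
y = dep.mean(axis=1)
X = np.column_stack([np.ones(len(y)), indep.mean(axis=1)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"Cronbach's alpha = {alpha:.2f}, slope = {beta[1]:.2f}")
```

In practice each construct's items would be averaged into a score, its alpha checked against the usual 0.7 threshold, and the dependent score regressed on all five independent scores at once; the sketch above shows the mechanics for a single predictor.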
Based on our analysis of the previous studies, one dependent variable and five independent variables were used. The dependent variable of our model, “implementation of RBIA”, was assessed by using seven factors: understanding of objectives; analysis of business environment; business risk assessment; risk assessment system use; risk management system evaluation; risk management system reliability; and internal control system evaluation.
The first independent variable, “internal auditor characteristics”, is assessed by five criteria: general knowledge; auditing knowledge; cooperation with external auditors; experience in accounting and auditing; and willingness to find solutions.
|Variable||References|
|Risk (dependent variable): implementation of an RBIA||Zainal Abidin (2017); Coetzee and Lubbe (2014); Koutoupis and Tsamis (2009); Allegrini and D’Onza (2003)|
|Internal auditor characteristics||Alqudah et al (2019); Endaya and Hanefah (2016); Sarens et al (2009); Arena and Azzone (2009)|
|Internal audit quality||Lois et al (2019); Alzeban and Gwilliam (2014); Cohen and Sayag (2010); Mihret and Yismaw (2007)|
|Review concern (expressed by stakeholders and users of the review)||Oussii and Boulila Taktak (2018); Zainal Abidin (2017); Turley and Zaman (2007); Mat Zain and Subramaniam (2007)|
|Risk management system||Mardessi and Ben Arab (2018); Drogalas et al (2020); Woods (2007); Goodwin-Stewart and Kent (2006)|
|Compliance with international internal audit standards||Alzeban (2019); Burnaby et al (2009); Sadler et al (2008)|
“Internal audit quality” is the second independent variable in our model and is measured by four criteria: usefulness of suggestions; internal auditor efficiency; communication with external auditors and employees; and compliance with the legal framework.
“Review concern” is the third independent variable. This independent variable is assessed by five criteria: provision of opinions on audit procedures; carrying out continuous internal audits; risk monitoring and expression of concerns; identification of shortcomings; and awareness of business development.
The fourth independent variable of our analysis is the “risk management system”, which is measured by four criteria: risk management procedures; the internal auditor’s role in the risk management system; implementation of proposals by the risk manager; and the risk management committee.
Finally, “compliance with international internal audit standards” is the fifth independent variable. This is assessed by four criteria: implementation of ISPPIA; management support regarding ISPPIA; existence of official documents complying with ISPPIA; and the internal auditor’s certifications. The formulation of the hypotheses is given in Table 1.
The aim of the methodology presented is to investigate the effect of the variables on the RBIA. To investigate the latent factors, Cronbach’s α test is applied (Vaske et al 2017).
|Questions||Latent variable||Cronbach’s α|
|2.1–2.7||Implementation of RBIA (Risk)||0.939|
|3.1–3.5||Internal auditor characteristics (IntAudChar)||0.794|
|4.1–4.5||Internal audit quality (IntQual)||0.851|
|5.1–5.5||Review concern (RevCon)||0.882|
|6.1–6.4||Risk management system (RiskMgtSys)||0.873|
|7.1–7.4||Compliance with international internal audit standards (CompInternAudStand)||0.911|
The results of the study are shown in Table 2. The latent factors, as formulated by the relevant literature, are well selected, since Cronbach’s α is greater than 0.7 for each factor.
The corresponding latent variables are calculated as the average values of the corresponding Likert-type responses. Therefore, the latent variable “risk” is calculated as follows:

Risk = (Q2.1 + Q2.2 + … + Q2.7)/7,

where Q2.1–Q2.7 are the responses to questions 2.1–2.7.
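As an illustration, the reliability and averaging steps can be sketched in a few lines of Python. The Likert responses below are hypothetical, not the study's data.

```python
from statistics import mean, pvariance

def cronbach_alpha(items):
    """Cronbach's alpha for a set of item columns (one list per question)."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]      # per-respondent total score
    item_var = sum(pvariance(col) for col in items)   # sum of item variances
    total_var = pvariance(totals)                     # variance of the total score
    return k / (k - 1) * (1 - item_var / total_var)

# Hypothetical Likert responses (1-5): three questions, three respondents each
q = [[4, 5, 3], [4, 4, 3], [5, 5, 2]]

alpha = cronbach_alpha(q)
# Latent variable = per-respondent mean of the item responses
risk = [mean(vals) for vals in zip(*q)]
```

With a full questionnaire the same functions apply unchanged; only the number of item columns grows.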
The majority of the respondents (47.62%) were internal auditors, while 20.95% were external auditors. The results also reveal that most of the respondents were working in the private sector. Finally, as far as internationalization is concerned, 54.29% of the companies the respondents worked for were collaborating on a global scale. The demographic statistics are presented in Table 3.
In Figure 1, the “risk” latent variable is analyzed by the nominal variables position and sector. Internal auditors report a higher median value of risk than external auditors (Figure 1(a)). The lowest median value is reported for managers (almost 3). According to Figure 1(b), there is higher risk in the private sector than in the public sector. These results are in line with practice.
The descriptive statistics of the latent variables are shown in Table 4. The last column shows the coefficient of variation (CV), which indicates for each variable that the observations are dispersed.
Several tests should be conducted in advance to guarantee the validity of the regression analysis. More specifically, the following tests were conducted:
correlation between the independent variables;
tolerance and variance inflation factor (VIF) tests for multicollinearity;
the Kolmogorov–Smirnov (KS) test for residual normality.
In this section the correlation analysis is presented. One of the main problems to take into consideration in a regression analysis is a high degree of (negative or positive) correlation between the independent variables. The correlation index applied in this analysis is the Pearson correlation coefficient presented in (4.1) (Zhou et al 2016):

r = Σi(xi − x̄)(yi − ȳ) / √(Σi(xi − x̄)² Σi(yi − ȳ)²).   (4.1)
In Table 5, weak positive correlations are reported between the independent variables. All correlations are statistically significant at the 1% level. The correlation scores, as calculated in Table 5, indicate low correlation, which allows the regression analysis to be applied.
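A minimal pure-Python sketch of the Pearson calculation, using made-up scores for two of the independent variables:

```python
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical latent-variable scores (not the study's data)
int_qual = [3.2, 4.1, 3.8, 4.5, 3.0]
rev_con  = [3.0, 4.0, 3.5, 4.2, 3.1]
r = pearson(int_qual, rev_con)
```

In practice the full 5×5 correlation matrix of Table 5 is just this function applied pairwise.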
One of the most common phenomena in regression analysis is collinearity, where one variable can be linearly predicted from the others with a substantial degree of accuracy (Winship and Western 2016). Collinearity between two variables is expressed mathematically by a linear association of the form:

X2 = λ0 + λ1X1.
Table 6 shows the multicollinearity results, in particular tolerance and the variance inflation factor (VIF). Tolerance values below 0.1 and/or VIF values above 10 indicate multicollinearity. As can be seen in Table 6, no indication of multicollinearity is reported based on the tolerance and VIF results.
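For illustration, in the two-predictor case the R² of one predictor regressed on the other reduces to the squared pairwise correlation, so tolerance and VIF can be computed directly; the correlation value below is hypothetical:

```python
def vif_two_predictors(r):
    """Tolerance and VIF when one predictor is regressed on a single other.

    With one regressor, R-squared is simply r**2, so
    tolerance = 1 - r**2 and VIF = 1 / tolerance.
    """
    tolerance = 1 - r ** 2
    return tolerance, 1 / tolerance

tol, vif = vif_two_predictors(0.32)   # hypothetical pairwise correlation
flag = (tol < 0.1) or (vif > 10)      # conventional multicollinearity cut-offs
```

With five regressors, as in the paper, each R² would come from a multiple regression of one predictor on the other four, but the tolerance/VIF definitions are unchanged.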
Another prerequisite test for the application of linear regression is the KS test for residual normality.
To test the normality of the variables, the following hypotheses are formulated.
"(H0) The sample data is not significantly different from that of a normal population."
"(H1) The sample data is significantly different from that of a normal population."
|Most extreme differences||Absolute||0.063|
|Asymp. sig. (two-tailed)||0.804|
From the results of the KS test presented in Table 7 it can be seen that the corresponding p-value equals 0.642, indicating that the null hypothesis presented above cannot be rejected.
Therefore, we conclude that the variables of the study are normally distributed. The same conclusion can be drawn from a Q–Q plot of unstandardized residuals (Figure 2). The observations are fitted very well by the line.
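A rough sketch of the one-sample KS statistic against a normal distribution fitted to the residuals; the residuals below are invented for illustration, and a production analysis would use a statistics package to obtain the p-value:

```python
from math import erf, sqrt
from statistics import mean, stdev

def normal_cdf(x, mu, sigma):
    # CDF of N(mu, sigma^2) via the error function
    return 0.5 * (1 + erf((x - mu) / (sigma * sqrt(2))))

def ks_statistic(sample):
    """One-sample Kolmogorov-Smirnov statistic against a fitted normal."""
    xs = sorted(sample)
    n = len(xs)
    mu, sigma = mean(xs), stdev(xs)
    d = 0.0
    for i, x in enumerate(xs):
        cdf = normal_cdf(x, mu, sigma)
        # compare the theoretical CDF with the empirical CDF on both sides
        d = max(d, (i + 1) / n - cdf, cdf - i / n)
    return d

# Invented regression residuals (not the study's data)
residuals = [-0.4, 0.1, -0.2, 0.3, 0.0, 0.2, -0.1, 0.35, -0.3, 0.05]
D = ks_statistic(residuals)
```

A small D (relative to the critical value for the sample size) is what allows the normality hypothesis to stand, as in Table 7.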
In this section the regression model will be constructed and validated. In Table 2, the variables of the study (dependent and independent) are formulated as latent variables for the corresponding questions of the questionnaire. The resulting regression model is the following:
The regression estimates calculated using the ordinary least squares (OLS) method are presented in Table 8. It can be seen that all the independent variables have a positive effect on “risk”. The greatest effect on “risk” is associated with the review concern (RevCon) variable. All the variables are statistically significant except for IntAudChar.
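As a simplified sketch of the OLS step, a single-predictor fit is shown below (the paper's model has five regressors, which would be estimated the same way via the normal equations); the data is hypothetical:

```python
def ols_simple(x, y):
    """Slope and intercept of y = a + b*x by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((a - mx) * (c - my) for a, c in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    return my - b * mx, b   # intercept, slope

# Hypothetical latent scores: review concern vs RBIA implementation
rev_con = [3.0, 3.5, 4.0, 4.2, 4.8]
risk    = [2.9, 3.4, 3.9, 4.3, 4.7]
a, b = ols_simple(rev_con, risk)   # b > 0: positive effect on "risk"
```

A positive slope here mirrors the positive coefficients reported in Table 8.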
Based on the results of Table 8, the hypotheses as formulated in the literature review are presented in Figure 3. With the exception of hypothesis H1, all the variables in the study have a positive and statistically significant correlation with the dependent variable.
According to the answers to the questionnaire, in most Greek businesses, the internal audit is very important in understanding business objectives and assessing the risks to which the entity is exposed. Further, Greek internal auditors are experienced people, characterized by specialist auditing knowledge, providing useful suggestions to the management.
Results regarding the variables of implementation of risk-based internal auditing appear to confirm the hypotheses formulated by the literature in all cases except for the characteristics of internal auditors. The regression analysis demonstrates a positive relationship between all factors and the implementation of risk-based auditing.
According to the regression analysis results, the relationship between the implementation of RBIA and the characteristics of internal auditors is positive but not statistically significant. Consequently, hypothesis H1 is not supported. By contrast, the quality of the internal audit is an important factor affecting the implementation of RBIA: the statistically significant positive relationship leads to the acceptance of hypothesis H2.
In addition, similarly to Zainal Abidin (2017), review concern and the risk management system are positively and significantly associated with the implementation of RBIA; as a result, hypotheses H3 and H4 are supported. Finally, compliance with ISPPIA is another important factor, given its positive and significant relationship with the implementation of RBIA. The results lead to the acceptance of hypothesis H5.
Nowadays, internal audit plays a major role for businesses, due to the various risks that exist in the economic environment and the financial scandals that have been uncovered. Consequently, more and more firms are adopting RBIA, trying to identify and mitigate the risks they are exposed to, in order to achieve their goals. In our study, we provide an empirical analysis of factors associated with the implementation of risk-based internal auditing in Greek firms.
Greek businesses implement risk management systems that adequately describe the procedures and the internal auditors’ role. Also, most of them apply the international standards for the professional practice of internal auditing, with management support. Nevertheless, cases of noncompliance with the ISPPIA have been detected.
The implementation of risk-based auditing is a complex, multifactor and delicate procedure. Although the characteristics of internal auditors play a role in this implementation, their effect is less evident than that of internal audit quality, review concern, risk management procedures and compliance with ISPPIA.
The literature has focused on specific variables of RBIA, such as the risk management system or the review concern, while some studies have mainly been concerned with critical factors for the effectiveness of the internal audit function (Alzeban 2019; Alqudah et al 2019; Zainal Abidin 2017; Castanheira et al 2010; Mihret and Yismaw 2007).
This paper extends the present literature by examining additional factors related to internal auditor characteristics, internal audit quality and international standards for the professional practice of internal auditing. The analysis investigates more variables associated with the implementation of RBIA and helps the management and executives focus on the appropriate factors in order to apply risk-based internal auditing.
There are a number of limitations on the interpretation of the results. First of all, there are the general limitations of the questionnaire surveys, such as the respondents’ impartiality. Further, the fact that most respondents work in the private sector means that the results about internal auditing in Greece should be interpreted with caution.
Taking the limitations into account, future studies should focus on a specific sector in order to obtain more specialized results. Moreover, further studies in other countries with different legal frameworks could reveal additional or different variables for the implementation of RBIA. Finally, research in countries that are facing or recovering from economic crises could assess the impact of such crises on firms’ economic environment and on RBIA adoption.
The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the paper.
The authors thank the independent peer reviewers for their consideration, effort and valuable comments, as well as the editor-in-chief of the Journal of Operational Risk, Adjunct Professor Marcelo Cruz.
Buy-side use of average pricing contributed to rash of failed trades and give-ups last March
Everyone was blaming everybody else. In March 2020, a massive spike in volumes led to chaos in the futures markets, with delays and trade breaks pitting banks and brokers against exchanges and clearers – and pretty much everyone against post-trade vendors – in claim and counterclaim. At the time, the only camp to largely escape attention was asset managers.
But in the industry’s ongoing inquiry into what happened, one aspect of buy-side behaviour is being pinpointed as the biggest source of clogs in the industry’s post-trade pipes: namely, a tendency among many large firms to use average price allocation methodologies to distribute gains and losses on trades across the thousands of funds they manage.
When the coronavirus pandemic slammed markets, a huge flood of orders reached the biggest brokers late in the trading day. Many watched powerless as their systems struggled to cope, and several suffered outages, resulting in a large number of trade breaks and missed margin payments.
“The futures allocation workflow leaves a lot to be desired. It’s a little shocking to me the way the process works, still to this day,” says the head of trading at a large US buy-side firm. “You execute, and you stick all the trades in a temporary account. And then you have until the end of the day to allocate the trades. I do think this is a weak spot in the futures market.”
Risk.net spoke to more than 20 banks, buy-side firms, tech vendors and trade groups to piece together how the industry’s trade processing pipes came to burst last March. The problem had built up steadily over time, fuelled by years of underinvestment in systems and infrastructure leading up to the spike in volumes.
In the immediate aftermath, banks blamed exchanges, notably Eurex, where around 15% of give-up contracts were reported late in March. The bourse was criticised for not being ready to extend clearing hours on busy days, something it says it quickly did, while in turn pointing the finger at two banks – understood to be Goldman Sachs and Morgan Stanley – for submitting orders to its central counterparty (CCP) in batches at the end of the trading day, rather than in parallel.
Both banks have subsequently changed their clearing architecture to allow orders to be fired in via parallel processing: Morgan Stanley in the space of a few days, Goldman some months later.
But the problems were not confined to just these firms. Across every broker, there were multiple days when trades could not be cleared to the right accounts. All the while, the fixed income and equity expiries for first-quarter-end – when asset managers roll and rebalance trillions of dollars’ worth of futures positions – were looming. Clearing operations teams focused their efforts on resolving breaks in those contracts first. But as existing breaks were resolved, new breaks were building up.
“As buy-side clients were trying to deal with the volume, they or their administrators were running their average pricing [algorithms] and instructing their executing brokers to give up trades around the Street. There were many instances where brokers were either bumping up against that window or actually missing it, and not able to even give out all their trades on the day,” says the head of futures at one large futures commission merchant (FCM).
Throw in failures on several crucial pieces of tech upon which the whole industry depends – in April, five sources cited problems with systems supplied by FIS, the largest vendor of post-trade derivatives clearing technology – and the result looks like the perfect storm for snarling derivatives trade processing. The resulting logjam took several weeks to clear out: more than one bank feared the entire system would collapse.
Following the meltdown, the Futures Industry Association set up a global working group to analyse what happened, and has found working practices to ensure best execution for investors contributed to the slowdown.
Don Byron, FIA’s head of global industry operations and execution, says: “A lot of buy-sides have increased their use of average price allocation methodologies. Typically, when you do that, you’re going to wait for all of your trading to be done at the end of the day. The later in the day you get it, the more you’re doing in a shorter period of time.”
One idea gaining traction among some of the largest FCMs is the creation of a new post-trade utility, to standardise the tangled flow of allocations from buy-side firms to brokers and CCPs. What that utility would look like, and who would run it, remain open questions (jump to box: A better future).
Talk to enough people, and no corner of the market escapes blame for the chaos that followed the March volatility explosion. But sources say the problems were particularly pronounced at Eurex, and that some banks were at the centre of more broken trades than others, with Goldman Sachs and Morgan Stanley the names mentioned most often.
Morgan Stanley declined to comment. However, independent data shared with Risk.net suggests at least five clearing members saw a high number of delayed trades at Eurex during March. Goldman topped the list; Morgan Stanley was fifth.
As the largest clearer by volume at Eurex on any given day, Goldman bore the brunt of the strains – and the ensuing anger among clients and counterparties. Some 15% of Goldman’s trades at Eurex during March were subject to delays, it is understood – in line with the average seen at the CCP that month – with the vast majority of these resolved within two days.
“We were concerned and disappointed during the Covid crisis regarding the performance of some of our peers. The fact of the matter is, their infrastructure broke down during Covid,” says the head of clearing at a large US bank.
Singling out Goldman, he adds: “There were hold-ups in terms of the give-ins they were sending to us, and also not accepting the give-ups we were sending to them. It did expose us to risk, because they broke down. We were hearing our peers were having challenges with their matching systems primarily. Goldman Sachs were probably the worst offenders. That flow particularly with Eurex and Goldman was very noisy.”
Others say the same. An executive at a European clearing house says Goldman “had real problems”, while three others in the clearing industry attest similarly.
Goldman Sachs has declined to comment.
Eurex admits there were delays, but says two-thirds of trade breaks were resolved by T+2. Without naming names, the exchange blames the problems on two banks that, it argues, did not have the optimal setup to access C7, its clearing system for listed and over-the-counter products. It was those FCMs that had issues processing volumes that asked the CCP to delay the close on certain days, according to Eurex Clearing’s chief technology officer, Manfred Matusza.
According to Eurex, these banks connected to C7’s message queueing system via a single-thread processing session, while other banks maximised their access through multiple parallel sessions, thereby increasing processing capacity.
The CCP says one FCM connected via a proprietary pipe, and another used a third-party vendor.
Risk.net understands the banks in question are Morgan Stanley and Goldman Sachs.
Morgan Stanley dealt with the problem by switching to parallel processing within 72 hours. Goldman made the change months later after deciding it would be imprudent to do so during a period of market stress.
But clearing sources say the CCP itself was not faultless.
“We were one of a number of brokers using single lines into Eurex, and Eurex themselves had not really proactively managed the situation in terms of making sure that there was consistency [of lines] across all brokers,” says the head of derivatives clearing at one bank.
The head of clearing operations at another large bank agrees that connecting with Eurex through multiple sessions brings its own challenges, because data is not segregated per thread: “You internally need to consume data from all of the sources, and then ensure that you are not duplicating those messages within your infrastructure. You would need to have very robust processing internally to ensure that you are not duplicating that data.”
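Conceptually, the deduplication problem looks something like the sketch below; the message shape and session layout are assumptions for illustration, not Eurex's actual C7 interface:

```python
def merge_sessions(sessions):
    """Fan-in message streams from parallel clearing sessions,
    suppressing duplicate message IDs across threads."""
    seen, merged = set(), []
    for stream in sessions:
        for msg in stream:
            if msg["id"] not in seen:   # drop anything already consumed
                seen.add(msg["id"])
                merged.append(msg)
    return merged

# Two hypothetical parallel sessions; trade T2 is delivered on both
s1 = [{"id": "T1", "qty": 10}, {"id": "T2", "qty": 5}]
s2 = [{"id": "T2", "qty": 5}, {"id": "T3", "qty": 7}]
msgs = merge_sessions([s1, s2])
```

This is the "very robust processing" the clearing head describes: without the `seen` check, the duplicate T2 would be booked twice.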
Another broker says Eurex, in the first few days of the volume spikes, was “not amenable” to extending the batch window, but subsequently became more accommodating as backlogs built up.
But for all the sound and fury in March, many are worried the industry has been storing up a more deep-seated problem for itself for a long time. Clearing brokers say that the widespread use of average pricing methodologies among buy-side firms contributed massively to the end-of-day traffic jams – and that the outages suffered as Covid loomed were the tipping point.
The basic lifecycle of an average price order is simple enough. When a trade comes in from an asset management client, their executing broker will hold the positions in a so-called suspense account – a mechanism that allows them to temporarily pause the order. Only once the average price of a trade is known – usually at the end of the trading session – can orders be allocated equally across the funds that the client manages.
“As soon as those fills come in, they are booked to what people often refer to as a suspense account, which is client-specific, and held there awaiting an allocation instruction. When that allocation instruction comes in, it is then matched with fill records to make sure it’s one and the same, and then the necessary bookings are done,” says the head of futures at a second Tier 1 bank.
After clients submit their allocation instructions at around 5pm or 6pm, the FCM’s processing cycle can begin. Everyone agrees this workflow is clunky at the best of times. However, in March, allocating overwhelming volumes of trades late in the day on successive days rapidly became impossible.
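The mechanics of an average price allocation can be sketched as follows; the fills, fund names and quantities are invented for illustration:

```python
def average_price_allocate(fills, fund_targets):
    """Average-price a day's fills and allocate contracts across funds.

    fills: list of (quantity, price) tuples; fund_targets: {fund: contracts}.
    """
    total_qty = sum(q for q, _ in fills)
    # volume-weighted average price over all fills in the suspense account
    avg_px = sum(q * p for q, p in fills) / total_qty
    assert total_qty == sum(fund_targets.values()), "allocations must cover the order"
    # every fund receives the same price, only quantities differ
    return avg_px, {fund: (qty, avg_px) for fund, qty in fund_targets.items()}

# Hypothetical order: 200 contracts filled in three pieces over the session
fills = [(60, 101.25), (90, 101.40), (50, 101.10)]
avg_px, allocs = average_price_allocate(fills, {"FundA": 120, "FundB": 80})
```

The catch the article describes is timing: none of this can run until the final fill arrives, which pushes the whole computation to the end of the day.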
Give-up trades – in which a client sends an order to their chosen executing broker, which then gives it to another broker to clear – were a particular trouble spot. In an average price order, the executing broker is responsible for communicating back the fills to the clearing broker. The client delivers the allocation instructions to both the executing broker and the clearing broker when the order is complete.
In March, clearing brokers received allocation instructions for large orders late in the day. The head of futures at a US bank says delays in pushing give-ups had a “knock-on impact” across the industry, causing delays in trades shared by FCMs.
Matching broken give-ups was the focus of the industry’s collective clean-up efforts. That wasn’t easy: some asset managers were left with positions that were not corrected before the month-end roll. One buy-side firm says it had two people working 12 hours a day for a fortnight to try and match up broken trades – a process that normally takes one person an hour a day. The errors, says a trader at the firm, took two months to be compensated.
The head of clearing operations at a second large FCM says there were thousands of trade breaks across the industry. An asset manager, meanwhile, says delays in futures margin processing and trade matching hindered best-execution requirements, adding that allocation issues were exacerbated by quarterly dividend risk.
But if the current workflow is suboptimal, it is not one the buy side is in a hurry to change. Asset managers both large and small say there are good reasons for their use of algo-driven average pricing – most of them rooted in ensuring the best price for their end-investors, often by dint of best-execution mandates.
“If you were to trade over the course of the day for multiple funds in any single stock equity future, you would want to have the same price at the end of the day so that no one customer or fund receives a worse price than another. Everything that we trade, we allocate at the end of the day. I think it would be difficult to mitigate against,” says Alex Jenkins, head of dealing at Polar Capital, a boutique investment firm.
Paul Squires, regional head of trading at US giant Invesco, agrees. Often, the firm’s traders need to work an order for several underlying portfolios that a fund manager is responsible for, he says: “[A trade] comes through to us as a single order. But there will be different individual fund splits, because of the different sizes of the portfolios. The trader may deliberately work that order in pieces over the whole session. His average price on that trade is only going to be realised at the end of the day.”
Those pieces can quickly become fragments: an order to buy 200 index futures contracts, for example, may result in 140 or 150 individual fills by an executing broker. And as volumes and the number of fill lines increase, the message traffic that needs to be processed by a broker’s matching engine – matching orders with fill messages, before reconciling the exchange clearing line with a client allocation – increases exponentially.
“Anything that goes wrong in any one of those channels causes a lot of headaches,” notes Squires.
Such a workflow – which can only begin around or after the end of the increasingly lengthy trading day – would be complicated enough. But some FCMs – including Goldman, it is understood – then choose to wait for confirmation from the client before they take in trades from the executing broker.
In other cases, clients simply ask the clearing broker to accept the trades that are alleged in their name without waiting for a client allocation file or confirmation of the trade for matching, if the trade has a valid reference. The method is called ‘carte blanche’, and the head of derivatives clearing says up to 30% of his bank’s flow with clients uses this, but it needs strong reconciliation processes between a client and its brokers, and good levels of straight-through-processing.
The more laborious match-based model – which can slow trade processing with additional checks – is preferred by some FCMs, which argue it is more operationally robust, and ultimately leads to fewer reconciliation breaks – and it is what some blue-chip clients prefer.
The clunkiness of the futures workflow used by asset managers is compounded, say multiple sources, by the wide range of operational pipes connecting FCMs to the buy side – workflows that can be highly bespoke. Not all clients have adopted the Fix protocol, which allows faster processing of message acknowledgements, for example. Allocations may instead be communicated by emailing a spreadsheet at the end of the day; through a file from a middleware technology provider, which may be different to the one used by the FCM or CCP handling the trade; or even by voice.
Some clients are unable to send order IDs – critical for matching purposes – in an efficient manner, complains the futures head.
The head of clearing operations at one large US FCM says the entire process could be standardised through an industry utility: “There are multiple ways of that allocation coming in, multiple ways of the FCM interpreting that allocation, normalising it and then doing the clearing process at the CCP.”
While current middleware technology was blamed for its part in the processing breakdown, in the aftermath of the event, post-trade giant FIS told Risk.net that industry budget trends – downward pressure on revenues on the sell-side and an increase in regulatory-driven costs for banks – have resulted in under-investment in derivatives trade processing capacity in the decade since the financial crisis.
This has meant banks have not sufficiently spent on “infrastructural buffer, processing power buffer or even operational buffer”, says John Omahen, head of post-trade technology product strategy at FIS. He suggests banks have been operating at the limits of their capacity for so long that, when volumes spike dramatically, there’s little or no margin for error.
That jumble of pipes and plumbing is certainly problematic, says FIS. The vendor points out that APIs between banks and clearing houses – the technological ‘handshake’ for trades – vary and some are not optimised for high volume processing. What’s more, allocation messages – especially those requesting trades passed from one clearing member to another during the day – typically flow through different pipes and different middle-office systems.
The futures head adds that CCPs don’t always support the allocation process and average pricing in the same way. CME, for instance, introduced an alternative, called notional value average pricing, in 2018.
The head of clearing at a US bank adds that investment in – and testing of – systems varies by bank. He says his bank stress-tests its infrastructure through a series of scenarios to make sure it can understand the operational throughput capacity. However, he is less confident his peers do the same.
And as firms have been left to plan capacity on their own, an underlying issue is that, while some banks had the discipline to make sure their pipes were sized for something extraordinary, some didn’t, according to the utility source.
Nonetheless, lessons have been learnt across the Street. The backlog prompted technology teams at affected firms to examine their processing capacity.
“It probably redefined what a peak day looked like for the industry,” says the utility source. “Each firm has been addressing the capacity challenges they’ve seen, including working with clients around how they allocate and how they confirm in order to try to move more processing to real-time trade day, rather than the end-of-day or T+1 cycle that has often been the case in futures.”
“You can pin it on the technology, you could pin it on the clients, but it was just a perfect storm of the end-of-day cycle being compressed relative to the volume of trading that was going on,” he concludes.
What might a better allocations workflow for futures and options trades look like? There are broadly two schools of thought within the industry: rearrange the existing post-trade pipes and remove some of the joins, so that they flow more efficiently; or rip them out and replace them with something else.
To many confronted with the current tangle, wholesale replacement might sound appealing. “If someone came along today and said ‘my business model is: I take two days to allocate’, you’d just be laughed at,” says a senior executive at one vendor in the second camp.
A senior futures and options technologist at another vendor describes the second scenario like this: rather than clients sending separate allocation instructions to their original executing broker and their clearing broker, a single venue or hub could sit between the parties, so that each sees the instruction at the same time – similar to the function MarkitServ currently performs in the bilateral and cleared over-the-counter rates workflow.
At present, in the vast majority of cases, a cleared client interest rate trade executed with a dealer is submitted as a single trade into MarkitServ, before clients submit allocations to match against the executed swap submitted by their dealer. The matched allocations are then submitted to the central counterparty (CCP), referencing the account for each fund in the allocation. Once the CCP receives them, the clearing broker for the client’s funds receives a message for each attempted allocation, and accepts or rejects each one for clearing.
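The matching step can be caricatured as a simple netting check run before anything reaches the CCP; the field names and amounts below are illustrative, not MarkitServ's actual message format:

```python
def match_allocations(block_notional, allocations, tol=0.0):
    """Pre-clearing check: fund-level allocations must net to the executed block."""
    return abs(sum(allocations.values()) - block_notional) <= tol

# Hypothetical cleared swap split across three funds
allocs = {"FundA": 60_000_000, "FundB": 25_000_000, "FundC": 15_000_000}
ok = match_allocations(100_000_000, allocs)
```

Only once this check passes would each fund-level allocation be submitted to the CCP for the clearing broker to accept or reject.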
For swaps trades on electronic venues, the workflow is different, given that clients pre-allocate the majority of those trades before submitting them to the order book. On any given day, roughly three-quarters of the global rates market flows through MarkitServ’s platform, it is understood, with the figure rising to 90% on some days.
As with any major change to workflows, however, wholesale replacement comes with a number of hurdles: most obviously, persuading all parties – particularly clients – to sign up to a new utility in significant enough numbers that it becomes effective and a worthwhile investment.
One way of rearranging existing workflows, says Jo Davies, head of Traiana – one of several vendors with a dog in the fight as the industry mulls its options – would be to take out the middleman in the form of executing and clearing brokers, and have clients communicate their allocation instructions directly to the CCP.
Such a setup, she argues, “takes out all the hops, skips and jumps of allocating to an executing broker, which slows down the process”, while acknowledging it would depend on “a common set of transaction data standards that the futures commission merchant, executing broker and client agree on”.
Still, Davies believes this “could be more easily achieved” than convincing clients and brokers to sign up for a new utility.
Additional reporting by Costas Mourselas; editing by Tom Osborn
*Update, February 10, 2021: this piece has been updated with more information on which futures commission merchants suffered delayed trades at Eurex during March 2020.
Flexible checks and balances at bank's asset management arm kept risks within tolerance
As Nordea Asset Management stared down the barrel of Covid-19 in early 2020, its chief risk officer, Søren Andersen, quickly began to map the various risks the firm would face. Employees working remotely, third-party vendors unable to deliver services and – not least – cyber attacks, were all in his sights.
It quickly dawned on him that the risks had one thing in common: they all depended on key controls functioning to plan.
“Controls were my main concern. Processes work, but controls are usually in an office, maybe sitting next to someone. Now, with people spread around, that control environment could be harmed,” says Andersen.
Very early in the pandemic, controls became a group-wide focus; management gathered to determine who would be responsible for handling new incidents and what training would be needed.
In late January 2020, Nordea Asset Management’s price and risk committee – stakeholders from risk management, investment management and trading – began holding weekly meetings and prepared for the worst.
By late February, the bank had formed a group-wide crisis management committee. Individual country committees began to hold frequent meetings, with Andersen on the Denmark committee. Those responsible for marking positions to determine net asset values met multiple times a day.
The manager prioritised critical controls, such as those for traders. “We have specific areas within operations that involve new products and our AUM [assets under management] is growing, so we need to make sure our traders can turn those positions,” says Andersen. “With liquidity in the market going down, it’s a challenge.”
Like other asset managers, Nordea had to determine whether information security might be compromised if people worked from home. In February, Andersen addressed the issue, testing the viability of equipment and virtual private network connections. Networking capability was scaled up so that all 20,000 Nordea Group employees could access its systems.
I was concerned about the controls environment. But it turns out that it actually worked fine
Søren Andersen, Nordea Asset Management
In March, the company sent home 50% of its people, and all units were asked to prepare a plan for 80% of staff working remotely, which was tested with only 25% of staff on-site. The firm implemented a range of ‘soft’ controls: awareness sessions for employees on cyber security, resilience, malware and securing information; pushing out critical control information to staff; and training managers in how to manage remotely. One of its main scenarios was the loss of offices.
The plan seemed to work. The share of staff working remotely quickly reached 80%, with those who needed to be in the office – including trading, IT and investment management – shunted to meeting rooms converted into ad hoc office space.
Controls were also an issue for vendors and IT service providers. Andersen upped his regular vendor interactions, and the team devised plans to cover work in case vendors were disrupted.
Early indications are that Andersen’s work has paid off. While the firm saw a lot of new incidents around cyber and malware attacks, no systems were breached.
“I was concerned about the controls environment,” says Andersen. “But it turns out that it actually worked fine. When I look at the 2020 statistics, our financial losses due to incidents were fairly normal.”
While trading-related incidents were a little higher than normal – problems with passive limit breaches due to the very volatile markets, for example – the firm’s incidents were broadly average overall, in both number and financial impact. And vendor services were uninterrupted.
Nordea Asset Management, which is fully owned by Nordea Group, has €254 billion ($306 billion) under management, most of it Ucits and alternative investment funds. It also handles separately managed mandates from institutional investors. Its products are actively managed, most of them in fixed income, but it also sells multi-asset equity funds.
Andersen joined the firm as CRO in mid-2016 following stints in risk management at SEB in Denmark and with several pension funds. He is responsible for the overall second-line risk function, managing teams of risk professionals in Denmark, Sweden and Luxembourg. His mission is to improve the governance and risk framework as well as the overall risk culture. This means harmonising Nordea Asset Management controls with those of its parent, while also meeting regulatory requirements.
On joining, Andersen set about creating a risk management framework independent of Nordea Group, exploring how to work without cleaving to bank-orientated governance irrelevant to the asset manager.
“When I joined, I experienced an organisation with a strong risk culture,” says Andersen. “However, we had challenges fitting in with the bank framework that was defined by Nordea Group.”
As owner of this relationship with IBM, I have the freedom to decide when to [undertake] development with the system
Andersen developed arm’s-length principles, examining each Nordea Group directive and policy for its relevance to the asset manager – where there is none, he creates his own.
The asset manager has its own change and incident management processes, IT system, business continuity plans and IT organisation – as well as its own legal and compliance departments. Nordea Group supplies its network, but the manager has its own local support.
It isn’t an exercise to be different at any cost, says Andersen, but to add value where the mandates differ.
Objectivity is key. The manager recently bought a new system for incident management and risk control self-assessments (RCSAs). Among the vendors it examined was Nordea Group – but it bought IBM, which offered the best fit for asset management processes.
“As owner of this relationship with IBM, I have the freedom to decide when to [undertake] development with the system,” says Andersen. “That freedom is extremely important to run our business. It materialises in many other places.”
Andersen’s main focus is operational risk, as well as business model and environmental, social and governance risks. Market and investment risk is delegated to a small team in Luxembourg, which serves as ombudsman between clients and product managers, ensuring investment products have market and liquidity risk safeguards built in.
He has embedded second-line risk managers within the businesses they oversee without sacrificing independence. “We have built that culture, which I’m proud of, but it does place responsibility on us to remain independent.”
In Andersen’s view, the value in a risk organisation that is close to the business is the ability to have regular interaction with key business stakeholders. This includes carefully screening incident reports and rejecting them if they are not clear enough. They can also engage closely in change projects and be a partner in business continuity discussions.
Another clear example is the RCSA process, where the risk team plays a more active role. In its self-assessment, the team needs to remain independent, but the process improves when it can help the business document and assess risks. These concepts can sometimes be challenging, so a common language is important.
“By pushing this, we have succeeded in getting to a stage where the business remembers and contacts us whenever risk-related topics are on the agenda,” says Andersen. “It is my belief that with this setup, we can better promote the strong risk culture we want.”
Much of the team’s time is devoted to mapping operational risks to the most significant business processes, an area where banks have also been heavily engaged. Andersen and his team have built a risk library that links to all these processes, so they know which risks may arise and which controls mitigate them.
Every incident, including near misses, is documented. Operational resilience is a major focus; impact analyses and continuity plans are updated yearly and tested. Like many banks, the firm employs scenario analysis to examine rare but high-impact risks and prepare mitigants.
Most of its operational risks are investment breaches – they include trading errors that occur infrequently but have a high impact when they do. “We spend a lot of our time trying to mitigate trading, limit checking and compliance breaches,” says Andersen.
Nordea has also developed a framework for dealing with regulatory risk. A case in point is the European Union’s Investment Firms Directive and Investment Firms Regulation (IFD/IFR), which require the use of so-called K-factors measuring the risks a firm poses to clients, to the market and to itself. Firms whose AUM exceeds €1.2 billion ($1.42 billion), or that breach certain other thresholds, such as the volume of client orders handled, are subject to K-factors.
The new regulation requires that operational risk capital for asset managers be the higher of 25% of fixed overheads – the current standard – and the amount calculated using K-factors. When the regulation was first proposed, Andersen received an analysis of its potential impact from a team of specialists that scans new regulations.
In the case of IFD/IFR, the team determined that risk to clients was the principal source of operational risk, and calculated K-factors based on AUM related to mandates – about one-fourth of overall AUM.
It conducted a trial of the new regulation and saw that the fixed overhead requirement was the dominant factor. So, for Nordea Asset Management, IFD/IFR will not have any effect on operational risk capital.
“With the mix of mandates and funds we have it will not have an effect, because the K-factors don’t add up to anything near the fixed overhead requirements,” says Andersen.
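The comparison Andersen describes reduces to taking the higher of two numbers. A minimal sketch, assuming K-AUM is the only applicable K-factor and using the IFR's published 0.02% K-AUM coefficient; the figures in the test are invented for illustration:

```python
def op_risk_capital(fixed_overheads, mandate_aum, k_aum_coefficient=0.0002):
    """Own-funds requirement under IFD/IFR (simplified): the higher of
    the fixed overheads requirement (25% of fixed overheads) and the
    K-factor requirement - here K-AUM only, applied to mandate-related
    AUM as in the Nordea example above."""
    fixed_overhead_req = 0.25 * fixed_overheads
    k_factor_req = k_aum_coefficient * mandate_aum
    return max(fixed_overhead_req, k_factor_req)
```

With mandate-related AUM around a quarter of the firm's total, the K-factor term comes out far below the fixed overheads requirement – which is why, as Andersen says, IFD/IFR leaves the firm's operational risk capital unchanged.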
With K-factors, as with all else, Andersen has dotted the ‘i’s and crossed the ‘t’s.
Risk Awards 2021: bank avoided tech snags and margin call surprises that plagued peers during crisis
The extreme market volatility unleashed by the Covid-19 pandemic last year tested bank clearing businesses to the limit, resulting in operational mishaps that hurt both the banks and their clients. JP Morgan is a rare exception.
The seeds of the bank’s success were sown years ago when it started investing in its clearing technology platform and its front-office risk management team. In 2020, the platform did not buckle when the market turmoil led to a steep rise in clearing volumes, while JP Morgan’s risk team monitored a host of metrics to ensure the bank had funding for potential large margin calls.
This did not go unnoticed by clients.
JP Morgan’s clearing service was “rock solid” during the chaos of March and April 2020, says an executive at a large UK-based asset manager who works with several clearing banks.
“In days where markets were moving to the tune of multiple standard deviations, JP was very calm,” he adds. “Others were getting increasingly nervous about getting cash in the door.”
A senior source at a mid-tier bank that clears through JP Morgan agrees, saying there was “no stress [from JP Morgan] to get cash from us” for margin calls from central counterparties (CCPs).
In March, after a jump in trading volumes in futures and options, operational bottlenecks developed in trade processing. There were missed margin payments, and give-up trades left on the books of the wrong clearing brokers as a result of delays. Some clearing brokers were left with uncollateralised overnight exposure to buy-side clients. Bank of America even feared “CCP systemic issues”.
No part of the trade processing system – clearing houses, clearing banks or software vendors – has escaped blame. The head of clearing at a large US bank is blunt about the role that some of the bank’s peers played in the gridlock: “Their infrastructure broke down during Covid.”
Not at JP Morgan, though. There “things went smoothly” in March and April, says the senior source at the mid-tier bank.
In days where markets were moving to the tune of multiple standard deviations, JP was very calm. Others were getting increasingly nervous about getting cash in the door
Executive at a large UK-based asset manager
“During the peak period, we noticed that they had some delays in processing give-up positions but nothing that caused any issues on our side. It was addressed by them in close to real time and I would say it was acceptable, given the volumes,” he adds.
In March, clearing volumes hit records for many benchmark equity, credit, interest rate and commodity contracts across multiple exchanges. At JP Morgan, client clearing volumes at the peak of the coronavirus crisis were three times the average daily volumes of 2019.
The operational backlogs that afflicted cleared trades also led to a large number of trade breaks. This then required a time-consuming effort by futures commission merchants (FCMs), or clearing banks, to match them up again.
The task was made easier at JP Morgan thanks to a tool developed in-house, called Intelligent Analyzer. Using machine learning techniques, the software scans the information emailed in by clients and identifies trade breaks.
“When the pandemic hit, we were able to scale up [the tool] reasonably quickly to extract trade break material,” says Anthony Fraser, head of global clearing operations at JP Morgan. “The ability to speed up our extraction of key information and the investigation process was a really important contributor to our ability to provide transparency to clients on what was happening with their breaks.”
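The internals of Intelligent Analyzer are not public, so the sketch below is only a toy illustration of the general idea – extracting trade details from client emails and comparing them against booked trades to flag breaks. It uses a regular expression in place of the machine learning techniques the bank describes, and every name, reference and message format is invented.

```python
import re

# Hypothetical booked trades, keyed by trade reference: (quantity, price)
BOOKED = {
    "TRD123": (500, 101.25),
    "TRD456": (250, 99.10),
}

# Invented email format: "ref TRD123: 500 lots @ 101.25"
TRADE_PATTERN = re.compile(
    r"ref\s*(?P<ref>TRD\d+)\D+(?P<qty>\d+)\s*lots?\s*@\s*(?P<px>\d+\.\d+)",
    re.IGNORECASE,
)

def find_breaks(email_body):
    """Scan a client email for trade details and flag any reference
    whose quantity or price disagrees with the trade on our books."""
    breaks = []
    for m in TRADE_PATTERN.finditer(email_body):
        ref, qty, px = m["ref"], int(m["qty"]), float(m["px"])
        if BOOKED.get(ref) != (qty, px):
            breaks.append(ref)
    return breaks
```

In practice the hard part is exactly what a regex cannot do – coping with the free-form, inconsistent language of real client emails – which is presumably where the machine learning comes in.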
One constraint on Intelligent Analyzer’s speed was the system of the counterparty to each broken trade, since resolving such trades requires the agreement of both parties to the transaction.
It wasn’t just the investment in new technologies that paid off for JP Morgan during the most frenetic trading periods of 2020. A growing trend towards electronic execution over the past five or so years had been a cue for the bank to expand the capacity of its established clearing technology, “to ensure that everything flows through effectively”, says head of clearing Nick Rustad.
The rise in electronic trading has been accompanied by greater use of trading algorithms, many of which execute orders near market close. This means there is now more clearing to be done towards the end of the clearing day.
On top of that, exchanges have extended trading hours in recent years, producing a double whammy of higher trading volumes and a shorter window between the end of the trading day and the end of the clearing day when transactions can be cleared.
JP Morgan’s ability to support high clearing volumes proved particularly useful during the crisis. “As market liquidity deteriorated during that period, we saw a shift in execution from voice or block into more electronic execution,” Rustad notes.
In addition to robust technologies, JP Morgan benefited from a thorough approach to risk management during the Covid-19 crisis. At a time of funding stress, some clearing banks had to scrape together the cash to meet surprise margin calls. But JP Morgan’s risk team, together with relationship managers and the treasury team, worked to ensure no large margin calls came as a surprise.
This risk team kept a close eye on daily metrics across cleared client portfolios, such as the contingent risk of market moves, intraday liquidity and live client profit and loss, among other data. The team also monitored the liquidity risk of JP Morgan’s clearing business.
This approach chimes with that recently proposed by Pedro Gurrola-Perez, head of research at the World Federation of Exchanges.
We’ve been delivering a consistent message for a number of years. We’ve been committed to the clearing business when others wobbled
CCPs’ procyclicality mitigation tools, such as margin floors, are not sufficient, he argues in a January 12 paper. Other measures are needed, he writes, including liquidity management by market participants that takes into account the possibility that margin calls and requirements may soar during periods of stress.
JP Morgan’s clearing business avoided a liquidity crunch last year also because the bank had earmarked a large amount of cash and other safe assets for meeting intraday margin calls. The scale of this financial commitment partly reflects JP Morgan’s confidence in its profitable clearing unit.
“We’ve been delivering a consistent message for a number of years,” Rustad says. “We’ve been committed to the clearing business when others wobbled.”
Half-hearted attempts at client clearing include those by Royal Bank of Scotland. In the middle of the market maelstrom in March, the bank shut its client clearing and execution business for futures and options trading. A source familiar with the unit said at the time that it had suffered from underinvestment and “death by a thousand cuts”.
JP Morgan’s commitment to the clearing business is underlined by its extensive research on various aspects of clearing, such as CCP margin models and product margin levels. The upshot is not just better-informed risk management by the bank but also direct benefits for clearing clients.
“When it comes to a number of what-ifs, like what if a customer goes down, what happens if another FCM at a CCP goes down, we get the best answers from JP Morgan,” says a source at a European asset manager.
The worst of the coronavirus market turmoil may be over, but JP Morgan is playing a prominent part in banks’ efforts to strengthen the clearing system ahead of the next crisis.
Banks have zoomed in on a spate of contract-level margin breaches in March 2020. Such a breach occurs when initial margin (IM) for a contract falls below the marked-to-market value of that contract.
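Per that definition, the breach check itself is a one-line comparison, applied contract by contract; a minimal sketch with invented figures:

```python
def is_margin_breach(initial_margin, mtm_value):
    """A contract-level margin breach, as defined above: the initial
    margin held for a contract falls below its marked-to-market value."""
    return initial_margin < mtm_value

def count_breaches(observations):
    """Count breaches across a series of (IM, MtM) observations,
    for example one per day through a stress period."""
    return sum(is_margin_breach(im, mtm) for im, mtm in observations)
```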
The FIA, which lobbies for the futures, options and cleared derivatives markets, appointed JP Morgan’s Rustad as its chair in October and, under his leadership, published recommendations to improve CCPs’ margin models.
The suggestions include introducing margin floors to prevent margins becoming too low during calm periods. Low peacetime margins raise the risk of sharp IM hikes when markets hit a rough patch.
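A margin floor of the kind the FIA recommends can be sketched as a simple override: whatever the risk model produces, the margin charged never falls below a floor, assumed here to be calibrated as a fraction of the worst margin over a historical stress window. The calibration is illustrative, not the FIA's specification.

```python
def floored_margin(model_margin, stress_window_margins, floor_fraction=0.25):
    """Initial margin with an anti-procyclicality floor: the model
    output is never allowed below an assumed fraction of the worst
    margin observed over a historical stress window."""
    floor = floor_fraction * max(stress_window_margins)
    return max(model_margin, floor)
```

In calm markets the floor binds, keeping margin from drifting too low – and so limiting how far it must jump when markets hit a rough patch.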
“I think the crisis really exposed significant flaws in CCPs’ initial margin models, particularly for exchange-traded derivatives,” Rustad says. “The size of these IM increases… they were procyclical and exacerbated the crisis with negative spillover effects into [the] broader market.”
Risk Awards 2021: new analytics dash helped bank get ahead of op risk breaches during Covid crisis
When guarding against operational risks, it is sometimes better to be lucky than good – and better still to be both. As Covid-19 swept across the globe in the early months of last year, banks were exposed to huge increases in the operational threats they faced, as employees and clients alike decamped en masse almost overnight to work from home.
A matter of months earlier, Credit Suisse had begun deploying a new dashboard for non-financial risk (NFR) management analytics – DNA for short – a smart, tech-driven solution to bolster its op risk analytics and monitoring capabilities. The dash’s outputs were designed to give the bank a more forward-looking gauge of its op risks, based on real-time monitoring of controls, and to reduce the lag in responding to and shutting down operational incidents before they turned into potential loss events.
But the timing of the rollout also meant it became a key plank of the bank’s response to Covid, helping it actively manage the disruption to its almost 50,000 employees, and informing more than a dozen practical initiatives it embarked on in the days and weeks that followed. Many of these were rooted in risk identification and appetite: monitoring traders’ risk limits, for instance, at a time when banks were seeing hundreds of soft breaches as markets gyrated wildly.
“We were able to pivot within days – to pull together components of our programme both to bring in information externally and internally, and present it under very intense, stressful scenarios where you had high market volatility,” says Jim Barkley, head of non-financial risk at Credit Suisse. “You had individuals at every level of the organisation – our CEO, our executive board, our board of directors – all wanting real-time information throughout the course of that. This tool allowed us to pull that together very quickly – and that was only because we built it in such a way that it could leverage those analytic capabilities and the [underlying] data platforms,” he adds.
All the more impressive is the DNA dash’s development story. From the project’s inception meeting, a core team of 12 engineers – working in partnership with two external vendors, one aiding the bank with visualisation elements, the other using big data analytics to make sense of the 17 different data sources the dash pulls information from – took just six weeks to build the dash’s initial deployment.
At the height of the Covid crisis in March, when volatilities were elevated, Barkley and his team were getting live, rich data as breaches occurred, and sharing it in real time with the bank’s desk-level, divisional and regional controls heads – the first-line executives charged with monitoring front-office staff, many of whom were now working remotely – allowing them to zero in on breaches as they happened.
As a result, processes that would previously have taken place weekly – monitoring risk limits, for instance, but also signoff limits on asset valuations, as liquidity gapped and pricing models broke down – could be shifted to daily, or even intraday.
We were able to pivot within days – to pull together components of our programme both to bring in information externally and internally, and present it under very intense, stressful scenarios where you had high market volatility
Jim Barkley, Credit Suisse
“They were able to zone down to a specific desk,” adds Barkley. “In some cases, they had calls two to three times a day to determine how those controls were doing.”
That would have been help enough in the midst of a crisis – but the bank’s developers went one better, building a Covid adjunct to the dash within five days flat, giving risk teams information on newly monitorable metrics such as Covid infection rates in its key jurisdictions, internal infrastructure stability reports and business continuity information.
Equally important, Barkley told Risk.net at the onset of the crisis, was the ability to analyse incidents not in isolation but with a view to how they could occur elsewhere: because all the bank’s controls are aligned to risk registries, it can immediately see if a failing control is affecting other divisions, Barkley noted at the time. For example, if a breach occurs on one of the bank’s equity trading desks, Barkley’s team can look at the key controls involved, as well as the linkages between them and those affecting other desks.
Pre-DNA, that might not have happened, he notes: the bank’s enterprise risk controls framework provided a top-down view, but one usually siloed by business function, with historically less sharing of knowledge and controls – making it difficult for senior managers to gain a holistic view of the risks facing the firm. That in turn meant its risk control self-assessments were less informed than they would be now.
“The way that risk appetite is set in the operational risk space is very much backward-looking. It’s based on incidents that happen, and how that impacts the bank with regard to loss profiles. That’s all very, very important – but, if I’m running a business, I want to know my projections going forward,” argues Barkley. “I like to think about operational risks like running a hedge fund; I want to have all the information in front of me flashing, and I want to know my potential impacts.”
Barkley, who earlier in his career was a bank prop trader, got his wish sooner than imagined: “A year ago, we were all talking academically about what could happen if you had a pandemic, or what could happen if markets go to a certain level” – then in March, “it all just happened”.
We start by providing an overview of this special issue. Its main aim is to pull together both empirical and conceptual studies on the different social actors of financial crimes in diverse sectors, and to develop novel insights into the various sociopolitical constructs of financial risks and crimes, so that such crimes can be predicted, detected and prevented and financial stability maintained. We first describe and attempt to synthesize the contributions of the papers in this special issue. We then observe that, while these papers have some connections to financial fraud risks, they do not engage directly with the emerging relationship between financial fraud risks and financial stability, though they shed light on existing scenarios related to this relationship. This introductory paper therefore offers an in-depth account of emerging financial fraud risks and their potential impact on financial stability. We expect that a more formal consideration of current thinking on financial fraud risks will provide an impetus for a better understanding of the importance of detecting and managing financial fraud risks to preserve financial stability. In turn, we hope it will become apparent to those involved in financial fraud risk research that understanding the routines, actions and outcomes of financial fraud risk management within organizations is likely to be a highly productive line of inquiry.
A transparent and competitive financial management policy and practice is, generally, instrumental for local, regional and global financial stability. However, different types of fraudulent financial activity – such as misappropriation of funds and Ponzi schemes (ie, deceitful investments, insider dealings), bribery and corruption, money laundering, online banking identity threats, investments in support of antisocial causes and market abuse – have from time to time undermined global financial stability. For example, “money laundering has become of increasing concern to law makers in recent years” (Norton 2018, p. 56). The damaging impact of “Ponzi schemes [on financial stability has] not been limited to one country [or region]” (Kethineni and Cao 2020, p. 5). In particular, “Ponzi-like investment schemes were popular in many transition economies. Often, some government officials had inside information about the viability of such schemes and used this information to their own advantage” (Sadiraj and Schram 2018, p. 29). In order to mitigate ongoing online identity threats and associated privacy risks in different sectors, including online banking, in recent years the OpenID concept has received considerable interest as a way to “solve identification, authentication, authorization and accounting (IAAA) [problems] with one unified flow and two tokens; making logging [in] easier, safer and more secure when compared with previous solutions” (Navas and Beltrán 2019, p. 1). However, Navas and Beltrán (2019) have also identified 16 different online attack patterns that would undermine the OpenID security system.
Regarding online banking security risk, if we look at the current management system of the global financial securities markets, we can see that there are concerns and an ongoing debate about the stability of the securities markets across the globe. For example, while the evidence on the social utility of insider trading is mixed, the conclusions about (financial) market abuses are rather obvious: frauds harm the integrity of financial markets and disrupt the mechanism of efficient allocation of financial resources (La Porta et al 1997, 1998; Easley and O’Hara 2004; Djankov et al 2008; Aitken et al 2015; see also Cumming et al 2018, p. 130).
It is evident that the threats from different kinds of financial frauds have, from time to time, destabilized the global financial system. However, we are not well equipped to combat such threats in order to stabilize the global financial industry in a sustainable way; as the extant literature demonstrates, we have limited knowledge of financial criminology to detect (or forecast) financial crime as early as possible, in order to combat fraud in this industry. For example, “what actually constituted market abuse in securities laws has been inconsistently defined across countries, thereby making analyses of what works in detecting market abuse [risk] rather intractable” (Cumming et al 2018, p. 130). In terms of Ponzi schemes, Hofstetter et al (2018, p. 18) argue that “what we find in the literature on Ponzi schemes remains, to a large extent, anecdotal”. And understanding “how [Ponzi] schemers build and maintain trust may help prevent or uncover the fraud earlier, limiting [the] financial devastation endured by unsuspecting investors, as well as [the] externalities inflicted on the financial system as investors lose trust” (Carey and Webb 2017, p. 589). However, “little attention has been given to how [a Ponzi scheme] offers new opportunities for illegal entrepreneurs to defraud investors” (Baucus and Mitteness 2016, p. 37), which distracts us from effectively predicting latent Ponzi schemes.
In terms of policing money laundering, Verhage (2017, p. 477) argues that there are “great expectations but little evidence” in effectually regulating different money laundering schemes. In fact, “there are difficulties in assessing the epidemiology of financial crime, and of money laundering in particular; information is scattered, fragmented, or missing” (Verhage 2017, p. 479). Although the “rule-based” concept of anti-money-laundering (AML) initiatives was not effective enough to efficiently prevent money laundering, more recently these initiatives have become more popular for combatting money laundering. In the Financial Action Task Force’s recommendation, “all the actors involved in the prevention of money laundering … have been asked to shift from a rule-based paradigm to a risk-based approach, allocating their AML efforts where the risk of money laundering is higher” (Savona and Riccardi 2019, p. 1). Regrettably, this call has so far been unable to prevent money laundering crimes across the globe, since “the result [of these reformed AML initiatives] is a patchwork of diverse exercises [using the risk-based concept, that are] often generic, difficult to compare and not always relevant for investigators and practitioners” (Savona and Riccardi 2019, p. 2). Further, besides the financial sector, “money launderers are [also] increasingly moving their activities to the non-financial sector” (Friedrich and Quick 2019, p. 1).
Alongside traditional money laundering practices, financial risk and crime have also increased with the rise of e-commerce platforms (Trequattrini et al 2016; Andraschko and Britzelmaier 2020; Mostafa 2020; Camilleri 2020; Nair 2020). For example, “according to research by the University of Cambridge, some 3 million people are estimated to be actively trading in cryptocurrencies today, and many are already using crypto to pay for items such as hotels, games and even their rent” (Orme 2019, p. 8). However, “the risk of cybercrime posed by digital currencies (eg, cryptocurrencies) is high due to its role in directly enabling cyber dependent crime” (HM Treasury 2017, p. 44; see also Jones 2018, p. 1). As a result, digital currency not only is becoming popular, but also “presents new regulatory challenges” (Latimer and Duffy 2019, p. 121). Jones (2018, p. 1) further adds that “cryptocurrencies, gambling, money laundering, illicit financing, [the] narcotics trade, [the] slave trade, sex trafficking, endangered species trafficking and the mafia, are all intertwined in the underbelly of crime … [and] require further and a more in-depth investigation”. As a result, there is an urgent call for more profound and perceptive research in financial forecasting that will have practical implications in effectively predicting financial market trends in order to detect diverse fraudulent financial risks in different sectors (Ahmed 2017; Shams et al 2018; Kethineni and Cao 2020; Maiti et al 2020). In this context, this special issue focuses on insights into
how to detect or forecast risk(s) associated with organized financial crimes as early as possible,
establishing alternative regulatory mechanisms as an antidote to online and offline financial fraud, and
maintaining financial stability using these mechanisms.
Investigating, detecting and preventing financial risks and crimes is “a very complex task requiring [us] to process huge amounts of data coming from different sources such as billings and bank account transactions in order to gain knowledge useful for an investigator” (Dreżewski et al 2015, p. 18). However, in the financial criminology literature, alternative sources of data for analyzing financial crimes are often ignored. For example, “a large body of literature addressed the problem of money laundering (Reuter and Truman 2004), many times ignoring the fact that illegal transactions require the interaction between social actors” (Colladon and Remondi 2017, p. 49). The analysis of a social network (Dreżewski et al 2015; Vrontis et al 2018; Akter et al 2020) of organized financial crimes would be instrumental in developing greater insights into the different social actors of financial risks and crimes, and in gathering relevant data about the social constructs of such risks and crimes (Colladon and Remondi 2017). However, using social network analysis in AML to expose the sociopolitical setting of financial risks and crimes, identify the social constructs of organized financial crimes, and detect and prevent such crimes more efficiently is “relatively new and not yet fully explored” (Colladon and Remondi 2017, p. 51).
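As a minimal illustration of this network perspective on AML, the sketch below computes a simple degree-centrality measure over a hypothetical transaction network and flags unusually well-connected accounts. The account names, transactions and flagging threshold are invented for illustration; this is a toy rendering of the general technique, not the method of Colladon and Remondi (2017).

```python
from collections import defaultdict

# Hypothetical transactions: (sender, receiver, amount). In practice these
# would be extracted from billing and bank account records.
transactions = [
    ("A", "HUB", 9_500), ("B", "HUB", 9_400), ("C", "HUB", 9_300),
    ("HUB", "D", 14_000), ("HUB", "E", 13_900),
    ("A", "B", 200), ("D", "E", 150),
]

# Build each account's set of distinct counterparties (an undirected view
# of the transaction network).
counterparties = defaultdict(set)
for sender, receiver, _amount in transactions:
    counterparties[sender].add(receiver)
    counterparties[receiver].add(sender)

# Degree centrality: share of all other accounts an account transacts with.
# Intermediaries that broker flows between otherwise separate actors, a
# pattern typical of layering, score highly.
n = len(counterparties)
centrality = {acct: len(peers) / (n - 1)
              for acct, peers in counterparties.items()}

# Flag highly central accounts for investigator review (threshold is illustrative).
flagged = [acct for acct, score in centrality.items() if score > 0.5]
print(flagged)
```

In this toy network the "HUB" account, which receives several just-under-threshold deposits and forwards consolidated sums onward, is the only account flagged, showing how a purely relational measure can surface actors that amount-based rules alone might miss.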
In this context, this special issue presents a collection of empirical and conceptual studies on different social actors of financial crimes in diverse sectors, in order to develop novel insights on various sociopolitical constructs of financial risks and crimes to predict, detect and prevent such crimes, with an aim to maintain financial stability. In particular, the aim of this special issue is to explore research insights that bridge the research gaps in the sociopolitical constructs of financial risks in order to predict, detect and prevent such crimes, and to explore theoretical concepts with practical propositions, based on both the implications of (single disciplinary or cross-disciplinary) theories on financial risk analysis practice and practice-based theorization.
This special issue called for research papers on sociopolitical-theory-driven conceptual arguments and empirical evidence on the determinants of the prediction, detection and prevention of financial fraud activities, as well as practice-based theorization of such activities. Given the sensitive nature of the topic, the response was somewhat limited: no submission directly took on the challenge of deriving a new theory based on a practical understanding of financial fraud activities or their management within companies. However, the submissions did focus on the detection, prevention and implications of financial fraud activities. In this section, we summarize the papers in this special issue, which shed light on this core focus.
The paper by Lois et al (2021), entitled “Critical variables in the implementation of a risk-based internal audit: a theoretical and empirical investigation of Greek companies”, investigates the determinants of the implementation of the risk-based internal audit in Greek companies. The risk-based internal audit, which plans internal auditors’ activities based on organizations’ inherent strategic risks, could play a crucial role in preventing fraudulent financial activities (Naheem 2016). The role of the risk-based internal audit may become more important in the circumstances of a financial crisis and its recovery, making Greece an interesting context for studying the factors affecting the implementation of the risk-based internal audit. Using primary data collected by survey questionnaires, Lois et al find that the quality of the internal audit, the review concern, the risk management system and compliance with international standards for the professional practice of internal auditing are positively and significantly related to the implementation of a risk-based internal audit. This paper contributes to the literature on the internal audit as a mechanism to prevent fraudulent financial activities by identifying additional explanatory variables that affect the implementation of risk-based internal audit in an important context: Greece.
The paper by Rossi et al (2021), entitled “The strange case of the Jet Airways bankruptcy: a financial structure analysis”, evaluates the ability of different financial models to predict bankruptcy and fraudulent financial reporting risks by using Jet Airways as a case study. The early prediction of bankruptcy and fraudulent financial reporting risks could enhance financial stability by providing timely signals of bankruptcy, aiding managers in devising financial restructuring and oversight strategies that could save a firm from ultimate bankruptcy. Rossi et al find that Jet Airways’ bankruptcy could be predicted accurately by the combined use of Altman’s Z-score and Piotroski’s F-score. However, their analysis of Beneish’s M-score does not indicate any evidence of fraudulent earnings management by the management of Jet Airways. Rossi et al therefore conclude that the failure of Jet Airways was probably due to managers’ incapacity to handle the financial model of the company efficiently. The findings of this paper indirectly suggest that the management of Jet Airways could have avoided bankruptcy had they used the financial models in this paper and devised their financial model accordingly.
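For readers unfamiliar with the first of these models, the classic Altman (1968) Z-score for public manufacturing firms combines five accounting ratios into a single distress indicator. The sketch below applies the standard coefficients and conventional cut-offs; the balance-sheet figures are hypothetical and are not Jet Airways’ actual accounts.

```python
def altman_z(working_capital, retained_earnings, ebit,
             market_value_equity, sales, total_assets, total_liabilities):
    """Classic Altman (1968) Z-score for public manufacturing firms."""
    ta = total_assets
    return (1.2 * working_capital / ta        # liquidity
            + 1.4 * retained_earnings / ta    # cumulative profitability
            + 3.3 * ebit / ta                 # operating profitability
            + 0.6 * market_value_equity / total_liabilities  # leverage
            + 1.0 * sales / ta)               # asset turnover

# Hypothetical figures (in millions) for a struggling carrier.
z = altman_z(working_capital=-500, retained_earnings=-1_200, ebit=-300,
             market_value_equity=400, sales=3_000, total_assets=2_500,
             total_liabilities=3_100)

# Conventional cut-offs: Z < 1.81 distress zone, Z > 2.99 safe zone.
zone = "distress" if z < 1.81 else ("safe" if z > 2.99 else "grey")
print(round(z, 2), zone)  # the hypothetical firm lands deep in the distress zone
```

Negative working capital and accumulated losses drag the score well below the 1.81 distress threshold, which is exactly the kind of early signal Rossi et al argue could have prompted timely restructuring.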
The paper by Mazumder and Sobhan (2021), entitled “The spillover effect of the Bangladesh Bank cyber heist on banks’ cyber risk disclosures in Bangladesh”, examines the first and most severe cyber theft from Bangladesh’s central bank, Bangladesh Bank, an event that reminds us of the cracks and vulnerabilities in the international banking system. Their study expands our understanding of the implications of cyber theft by analyzing the spillover effect of this appalling cyber theft on the cyber risk disclosures of the banking sector in Bangladesh. They identify the Bangladesh Bank cyber theft as a legitimacy-threatening event for the country’s banking sector, and cyber risk disclosures as a legitimacy-regaining strategy. Mazumder and Sobhan hypothesize the positive spillover of the Bangladesh Bank heist on overall cyber risk disclosures of the banking sector in Bangladesh, while also proposing that the legitimacy-seeking motives differ depending on the nature of the bank. Analyzing cyber risk disclosure using the automated content analysis method, Mazumder and Sobhan find that banks’ cyber risk disclosures increased significantly after the heist. Moreover, they find that banks’ political embeddedness and adherence to Islamic Shariah negatively influenced their tendency to use cyber risk disclosures as a legitimacy-regaining strategy after the heist. The paper’s novel contribution is to demonstrate that the spillover effect of legitimacy-threatening cyber theft is positive overall but differs between types of firms in a setting where there is a lack of cyber risk disclosure regulations and reporting frameworks.
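In highly reduced form, automated content analysis of cyber risk disclosures amounts to scoring disclosure text against a keyword dictionary and comparing scores across periods. The term list and sample extracts below are hypothetical; studies such as Mazumder and Sobhan’s use far richer dictionaries and validation procedures.

```python
import re

# Hypothetical cyber-related keyword dictionary (real studies use much
# larger, validated term lists).
CYBER_TERMS = {"cyber", "hacking", "malware", "phishing", "ransomware",
               "information security", "data breach"}

def cyber_disclosure_score(text: str) -> int:
    """Count occurrences of cyber-related terms in a disclosure extract."""
    text = text.lower()
    return sum(len(re.findall(re.escape(term), text)) for term in CYBER_TERMS)

# Invented annual-report extracts, before and after a hypothetical incident.
pre_heist = "The bank maintains adequate internal controls over its operations."
post_heist = ("The bank has strengthened its cyber security posture, deploying "
              "anti-malware tools and phishing awareness training to reduce "
              "the risk of a data breach.")

print(cyber_disclosure_score(pre_heist), cyber_disclosure_score(post_heist))
```

Comparing such scores across bank-years is what allows a study to test whether disclosure intensity rose after a legitimacy-threatening event and whether the rise differs across bank types.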
It is impractical to expect a special issue to incorporate papers covering all the dimensions of financial fraud activities and their implications for financial stability. The papers in this special issue (along with much academic research and many practical implications in this arena) share the concern that financial fraud activities could be damaging for organizational and global financial stability. However, beyond the repetition of the importance of transparency (Mazumder and Sobhan 2021; Rossi et al 2021) and internal checks and balances (Lois et al 2021), most studies are silent on mechanisms to combat financial fraud activities and their effects on financial stability. Hence, the tasks of advancing research into financial fraud activities and understanding their causes and consequences (Shams 2016a) remain at an embryonic stage when it comes to conceptualizing sustainable financial stability (Shams 2016b). There are several empirical issues we believe are missing from the academic literature in this research domain that it is paramount to address if we are to advance the study of financial fraud activities in cross-disciplinary research. In this context and in support of the arguments in the literature, we propose four main relatively under-researched but important areas as future research directions for this field, to underpin our knowledge and practice of financial fraud risk management.
First, the banking and finance sector has historically been, and remains, plagued by fraud and financial scandal (Toms 2019). Several explanations for this high exposure to fraud and financial scandal have been identified in the literature. One recurring theme is performance-based management incentive schemes (Beyer et al 2013) combined with ownership concentration (Toms 2019; Coffee 2005). Some anecdotal evidence suggests that the financial sector represents a very different cultural environment than academia, characterized by a strong “code of silence” and extremely weak job security (Luyendijk 2015). However, the impact of this cultural environment on financial fraud activities has not been systematically investigated in the academic literature. Hence, cross-sectoral comparative studies on organizational culture and its effect on financial risks and crimes would enhance our understanding, as well as help policy makers to detect and prevent such financial fraud activities. Other frequently cited reasons include failures of corporate governance, notably of external audit and of the authorities responsible for regulating statutory auditors (Coffee 2001, 2003; Soltani 2014). While many past reforms have failed, and commentators have accused policy makers of taking a reactive approach based on neoliberal financial governance ideology (Coffee 2019), more far-reaching audit reforms have been proposed in the United Kingdom, including changes to the responsibility and authority of the audit regulators. Therefore, investigation of the effectiveness of these new policy measures could be one fruitful area of further research.
Second, several studies have used “irrational exuberance” (Shiller 2000, p. 66) to understand the characteristics of the victims that motivated them to invest in Ponzi schemes (see, for example, Amoah 2018; Ullah et al 2020). These studies reveal that investors’ biases, their affinity with the founder or the first group of investors in the Ponzi scheme, their investment knowledge, their understanding of Ponzi schemes and their educational level all significantly affect the chances of an investor being (or not being) the victim of a Ponzi scheme. The literature also provides some evidence on how the perpetrators of Ponzi schemes capitalize on the psychological dispositions and projection biases involved in human decision-making to build affinity and maintain trust with the victims (Frankel 2012; Perri and Brody 2012). However, with the exception of Hofstetter et al (2018), academic studies on the effect of the rise and fall of Ponzi schemes focus on the financial system and stability. Hofstetter et al (2018) demonstrated the impact of the rise and fall of two Ponzi schemes on the personal credit standing of the victims as well as deposits in the financial sector in the municipalities most affected by the schemes. Future studies on how high-profile Ponzi schemes affect the financial wellbeing of the victims as well as the health of global financial systems would help international policy makers raise awareness among potential victims. In addition, social media, which facilitates social interaction between Ponzi scheme founders and investors, is becoming a breeding ground for Ponzi schemes. However, except for Rantala (2019), academic studies on how social media facilitates the diffusion of the investment idea and contributes to the growth and survival of the socially spreading Ponzi scheme are virtually nonexistent.
Further studies could help us better understand whether particular types of social media facilitate the social interaction between Ponzi scammers and their victims more than others. Such research could help strengthen the security of those platforms, enabling them to monitor the behavior of potential Ponzi scheme perpetrators and potentially assist in preventing such crimes.
Third, cryptocurrencies such as Bitcoin (Brito and Castillo 2013) and, more recently, Ethereum and Monero have offered new opportunities to many perpetrators of financial crimes. Fraudsters can easily take advantage of the anonymity and decentralization offered by cryptocurrencies, and thus easily mask their identity (Bartoletti et al 2020). Moreover, there is an absence of consistent regulations across jurisdictions, making it challenging for law enforcement agencies to unmask cross-border cryptocurrency-based Ponzi schemes (Zetzsche et al 2017). The vulnerability of cryptocurrencies and their associated regulatory challenges offer enormous opportunities for cross-disciplinary research to understand the security measures of cryptocurrencies, and thereby formulate and enforce regulations to better detect and prevent cryptocurrency-based Ponzi schemes and other financial crimes.
Fourth, we argue that there is a lack of comparative studies in all areas of financial fraud activities across times and sociopolitical environments (Reurink 2018). Toms (2019) offers an excellent historical overview of financial fraud incidents from around 1720 to 2009, and relates the increase in incidents to the advent of “complex group structures and international capital mobility, and mediated by managerial incentives and ownership concentration” (Toms 2019, p. 477). However, Toms only focuses on financial fraud incidents in the United Kingdom and the United States. Future studies could uncover the occurrence of these types of incidents in other countries and explore whether the relationship evidenced by Toms (2019) is also applicable in those countries. Other country- or region-specific studies could in future offer us novel insights for financial fraud management. Similarly, future studies could build on existing work in the field of comparative political economy and the literature in law and finance (see, for example, La Porta et al 1998; Coffee 2005; Deakin et al 2017) to understand the causal mechanisms between sociopolitical constructs and the incidence and types of fraudulent financial activities. If these causal mechanisms can be understood, this will help regulators and policy makers to reform some institutional elements to detect and prevent such crimes.
This special issue aims to inspire researchers and practitioners working on financial fraud risk management, and to investigate the issues associated with financial fraud risks: how they are shaped by the sociopolitical context and how they affect financial stability. Toward this end, we have brought together three papers examining three different issues relating to financial fraud risks. These papers include insights, tools and strategies for the detection and prevention of financial fraud activities. In addition to these contributions to the literature, we are eager to push the case for multidisciplinary research that develops and builds on the existing literature to expand our understanding of the management of financial fraud risks and the greater role this can play in maintaining financial stability. We hope that in this respect some of the heuristics and literature identified here as future research directions will support researchers and practitioners in realizing this aspiration.
The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the paper.