Pushing for Industry Standardization of “The Machine”

By Donal Gallagher, Co-Head – Acadia Quant Services

For the global finance industry, changes in the business environment have created tremendous opportunities and unique challenges. The advent and broad adoption of cloud computing technology have made it far easier to capture data than in the past. And the rapid increase in computing power, combined with falling costs, has made it possible to process vast amounts of information in ways that were not previously feasible. What’s more, the acceptance of information security standards has given businesses the confidence to outsource functions. This confluence of trends is a prime opportunity for financial institutions to derive actionable insights from their data and add value to their organizations.

And yet, for financial institutions that trade derivatives, getting the reams of necessary data into ‘the machine’ remains a cumbersome task that requires vast resources and institutional knowledge gained from years of experience with specific, bespoke derivative calculations. However, in the face of a global push for more centralized, industry-standard reconciliation and calculations, it is the perfect time for firms to consider the cost of keeping the machine grinding.

What is “The Machine”?

It is the process by which an organization takes the trades that generate risk, processes them, and produces numbers that can be analyzed to gain a sense of the risk across the institution. In every single financial institution, the process is incredibly complex; in some banks, it takes feeds from upwards of 40 different systems. Highly qualified people spend an inordinate amount of their time developing, maintaining, and supporting it. These complicated calculations produce terabytes of information and boil them down into what is essentially a spreadsheet. Only at that point does the process of analyzing the information for decision making begin. This function has been built up over time to the point where it is so nuanced that working on it cannot easily be transferred to more junior staff or outsourced. Taking a step back, this process begs to be centralized and done once in a utility, freeing people’s time to use the outputs to make better risk management decisions.
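To make that description concrete, here is a deliberately minimal sketch, in Python, of the shape of such a pipeline: trade-level numbers arriving from several source systems are mapped onto a common schema and boiled down to a single desk-level summary. The feed names, fields, and figures are invented for illustration only; no real institution’s architecture or data is implied, and a production machine involves orders of magnitude more systems, validation, and reconciliation.

```python
import pandas as pd

# Illustrative feeds only: in practice "the machine" ingests data from dozens
# of trading, collateral and market-data systems, each with its own format.
FEEDS = {
    "rates_system":  [{"trade_id": "IRS-001", "desk": "Rates",  "pv": 1_250_000, "delta": 45_000}],
    "credit_system": [{"trade_id": "CDS-042", "desk": "Credit", "pv": -300_000,  "delta": -12_000}],
    "fx_system":     [{"trade_id": "FXF-317", "desk": "FX",     "pv": 80_000,    "delta": 7_500}],
}

def normalize(feed_name: str, rows: list) -> pd.DataFrame:
    """Map one system's feed onto a common schema (the bespoke, labor-intensive part)."""
    df = pd.DataFrame(rows)
    df["source"] = feed_name
    return df[["source", "trade_id", "desk", "pv", "delta"]]

def run_the_machine(feeds: dict) -> pd.DataFrame:
    """Combine every feed and boil the detail down to a desk-level summary,
    essentially the 'spreadsheet' that risk analysis starts from."""
    combined = pd.concat(normalize(name, rows) for name, rows in feeds.items())
    return combined.groupby("desk")[["pv", "delta"]].sum().reset_index()

if __name__ == "__main__":
    print(run_the_machine(FEEDS))
```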

The opportunity from industry standardization

There is a strong push for these calculations of risk to become more standardized, with regulators taking the position that these overly complicated, bespoke machines for each individual institution are a risk unto themselves. This has created an opportunity to develop processes that calculate risk centrally, with one utility serving as an engine for the whole industry.
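As a loose illustration of the duplication at stake, consider a generic sensitivity-based aggregation of the kind standardized methodologies define. The risk weights and correlation below are placeholders invented for this sketch, not parameters from any published standard; the point is simply that a deterministic calculation like this, run once in a shared utility, returns the same number to every participant who submits the same inputs, instead of being re-implemented and re-reconciled inside each firm.

```python
from math import sqrt

# Placeholder risk weights and cross-class correlation, invented for this
# sketch; they are not taken from any published regulatory standard.
RISK_WEIGHTS = {"IR": 50.0, "FX": 30.0}
CORRELATION = 0.25

def standardized_charge(sensitivities: dict) -> float:
    """Aggregate weighted sensitivities via sqrt(sum_i sum_j rho_ij * WS_i * WS_j),
    the generic shape of many sensitivity-based capital and margin formulas."""
    ws = {k: RISK_WEIGHTS[k] * s for k, s in sensitivities.items()}
    total = 0.0
    for i, wsi in ws.items():
        for j, wsj in ws.items():
            rho = 1.0 if i == j else CORRELATION
            total += rho * wsi * wsj
    return sqrt(total)

# Every firm submitting the same sensitivities gets the same figure back,
# computed once centrally rather than rebuilt inside each in-house machine.
print(standardized_charge({"IR": 1_000.0, "FX": -400.0}))
```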

Acadia is playing a leading role in the move towards standardization, using our vast processing power to help develop an industry-wide, centralized engine and working with firms to adopt those calculations. And from an information security standpoint, we’re at a point where firms trust that their data and information will be protected and maintained to world-class standards. As the regulatory shift continues, our thought leadership, born of years of practical experience, puts us in the perfect position to help.

Preparing staff for the future of risk analysis

Businesses gain real analytical experience in risk when things go wrong. Allowing more time to understand previous industry failures — how things went wrong in the past and where they could go wrong today and in the future — is how you develop institutional knowledge of risk. If financial institutions over-allocate staff resources to crunching numbers, not enough people are developing that knowledge of risk and acting on it. A good example is the way the engineering and aviation industries develop and train talent. By highlighting previous real-world mistakes as case studies, we can gain a deeper understanding of exposure to risk and how to manage it effectively across the trade lifecycle.

In practice, as a centralized utility frees people up for these more productive tasks, they need the skills to carry out robust analysis. Risk management teams have an opportunity to expand their skill sets to incorporate more data analytics and data science and, further into the future, machine learning.

As less time is spent on manual intervention, banks and other financial institutions will find themselves able to allocate capital against risk more efficiently. That multi-dimensional work will prove challenging, but ultimately far more valuable.

About Donal Gallagher

Donal Gallagher is President of Acadia’s Quaternion Expert Services and Co-Head of its Quantitative Services division. Prior to Acadia’s acquisition of Quaternion in 2021, Donal was Co-Founder and CEO of Quaternion Risk Management. As a senior financial engineer and risk management professional, Donal acts as an advisor to Chief Risk Officers and Chief Financial Officers, focusing on the transparent pricing and risk management of complex portfolios of financial instruments. Donal complements this advisory work with the development and implementation of sophisticated pricing software. Donal holds a Ph.D. in Applied Mathematics from the California Institute of Technology and an M.Sc. in Mathematical Physics from the National University of Ireland (University College Dublin). Donal is co-author of “Modern Derivatives Pricing and Credit Exposure Analysis”, Palgrave Macmillan, 2015.

For more information, please visit us at acadia.inc or email us at info@acadia.inc.
