Solving the data challenge: technical solutions for optimizing risk management, capital resources and liquidity

Since the financial crisis that began in 2007-08, regulatory pressure on capital adequacy, liquidity, funding, balance sheet size and leverage requirements has become increasingly intense. As a result, financial institutions must manage their scarce financial resources ever more wisely, explains Emmanuel Danzin of Opensee.

Emmanuel Danzin, Opensee

For capital markets traders and risk managers, this means having a thorough understanding of risk and of how resources are used, so that trading opportunities are optimized under a strict set of constraints and remain within risk limits and the overall strategy. For treasury functions, it requires mastering future cash flow projections across multiple scenarios and angles, and managing liquidity risk with a minimal margin of error and cost. Either way, desks need to handle large datasets, performing non-linear aggregations while still being able to explore the data at the most granular level, and all of this needs to be done quickly and with autonomy for business users.

In recent years, technologies have been deployed to take advantage of faster hardware, such as in-memory (RAM) loading or full revaluations on graphics processing units, or to use smart shortcuts, such as pre-aggregations, or machine learning that analyzes historical data usage and pre-indexes datasets accordingly. However, significant challenges remain in delivering the full richness of datasets to business users quickly and consistently, without excessive infrastructure costs.

The problem of non-linearity

Essentially, banks have to deal with massive amounts of data because the aggregations that need to be calculated are not linear. The marginal impacts of changes – for example, new transactions or changes in projected cash flows – require complex re-aggregation with the rest of a portfolio as different netting methodologies are applied.
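
As a minimal illustration of this non-linearity, the Python sketch below uses made-up figures and a deliberately simplified exposure measure (net exposure per netting set, floored at zero) to show that the marginal impact of a new trade cannot be read from the trade in isolation; the whole netting set has to be re-aggregated.

```python
from collections import defaultdict

def netted_exposure(trades):
    """Sum of positive net exposure per netting set (a simplified, non-linear measure)."""
    by_set = defaultdict(float)
    for trade in trades:
        by_set[trade["netting_set"]] += trade["exposure"]
    return sum(max(net, 0.0) for net in by_set.values())

portfolio = [
    {"netting_set": "CPTY_A", "exposure": 80.0},
    {"netting_set": "CPTY_A", "exposure": -50.0},
    {"netting_set": "CPTY_B", "exposure": 40.0},
]
new_trade = {"netting_set": "CPTY_A", "exposure": -20.0}

standalone = max(new_trade["exposure"], 0.0)  # 0.0: the trade alone adds no exposure
marginal = netted_exposure(portfolio + [new_trade]) - netted_exposure(portfolio)
print(standalone, marginal)  # 0.0 vs -20.0: the marginal impact depends on the rest of the portfolio
```

The new trade carries no positive exposure on its own, yet it reduces the portfolio's netted exposure by 20 because it offsets existing positions in the same netting set, which is exactly why marginal impacts require full re-aggregation.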

In the capital markets space, risks and the resources they consume are monitored through long-term projections, risk-weighted assets, sensitivities and valuation adjustments (XVAs) that take into account exposure to credit, the impact of funding and the use of capital, among others, as well as a range of risk measures. To determine the footprint of transactions and the use made of resources deemed scarce, “what if” simulations are used to assess these marginal impacts using non-linear aggregations. These can be run before or after the transaction, as follows:

Pre-trade calculations
Pre-trade calculations measure the differential impacts of new trades so traders can assess their relevance to the desk in terms of strategy and risk. Here, speed is key, as traders need to calculate these marginal impacts on all resources quickly to decide whether or not to trade, and on what economic terms, while minimizing the margins of error that lead to costly buffers. A simple what-if check of this kind is sketched below.
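
The sketch below is a minimal, hypothetical version of such a pre-trade check (metric names, limits and figures are all assumptions): each measure is recomputed on the full portfolio with and without the candidate trade, because non-linear aggregation makes marginal impacts portfolio-dependent.

```python
def marginal_impacts(portfolio, candidate, metrics):
    """metrics: dict mapping a name to a function that evaluates a whole portfolio."""
    return {
        name: measure(portfolio + [candidate]) - measure(portfolio)
        for name, measure in metrics.items()
    }

def breached_limits(impacts, usage, limits):
    """Metrics whose limit would be breached if the candidate trade were executed."""
    return [m for m, delta in impacts.items() if usage[m] + delta > limits[m]]

# A crude gross-notional "metric" keeps the example short; a real desk would plug in
# RWAs, sensitivities, XVAs and other non-linear measures computed on the full portfolio.
metrics = {"gross_notional": lambda trades: sum(abs(t["notional"]) for t in trades)}
portfolio = [{"notional": 100.0}, {"notional": -60.0}]
candidate = {"notional": 25.0}

impacts = marginal_impacts(portfolio, candidate, metrics)
print(impacts, breached_limits(impacts, usage={"gross_notional": 160.0}, limits={"gross_notional": 180.0}))
```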

Post-trade impacts
Post-trade impacts should also be measured and assessed with potential mitigation measures in mind, such as portfolio reshuffling, netting, trade compression, trade restructuring and synthetic offloading. Speed is less of an issue in these cases, but the multiple scenarios that need to be considered result in huge volumes of data.
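
The toy sketch below (illustrative data, thresholds and measures only, not any vendor's method) hints at why: even a crude compression search over roughly offsetting trade pairs requires one full re-aggregation per candidate scenario, and the scenario space grows combinatorially with portfolio size.

```python
from itertools import combinations

def gross_notional(trades):
    return sum(abs(t["notional"]) for t in trades)

def net_by_netting_set(trades):
    totals = {}
    for t in trades:
        totals[t["netting_set"]] = totals.get(t["netting_set"], 0.0) + t["notional"]
    return sum(abs(v) for v in totals.values())

def compression_scenarios(portfolio, tolerance=15.0):
    """Yield (pair, remaining_portfolio) for same-netting-set pairs that roughly offset."""
    for i, j in combinations(range(len(portfolio)), 2):
        a, b = portfolio[i], portfolio[j]
        if a["netting_set"] == b["netting_set"] and abs(a["notional"] + b["notional"]) <= tolerance:
            yield (i, j), [t for k, t in enumerate(portfolio) if k not in (i, j)]

portfolio = [
    {"netting_set": "A", "notional": 100.0},
    {"netting_set": "A", "notional": -90.0},
    {"netting_set": "B", "notional": 50.0},
    {"netting_set": "B", "notional": -45.0},
]
print("baseline:", gross_notional(portfolio), net_by_netting_set(portfolio))
for pair, remaining in compression_scenarios(portfolio):
    # each candidate scenario requires a full re-aggregation of the remaining portfolio
    print(pair, gross_notional(remaining), net_by_netting_set(remaining))
```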

Asset-liability management (ALM)/Treasury
In the ALM/treasury space, aggregations are also non-linear and diverse, as they must comply with a wide range of netting rules, such as accounting standards – International Financial Reporting Standards and US Generally Accepted Accounting Principles, for example – the leverage ratio and taxation, as well as any additional constraints that apply to global systemically important banks. These calculations – for example, liquidity metrics such as the Net Stable Funding Ratio and the Liquidity Coverage Ratio – need to be performed on the fly for multiple cash flow projection scenarios, each at a granular level that preserves the full richness of the dataset. Ideally, treasurers want to manage liquidity precisely enough to cover different scenarios at a lower hedging cost. With a granular, rich dataset, future liquidity positions can be simulated, then aggregated and investigated with increased precision. Access to the full range of datasets also opens up a plethora of new possibilities – for example, applying machine learning to historical data to better predict future behaviors.
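
As a simple illustration of running such a metric across scenarios, the sketch below applies the standard Liquidity Coverage Ratio formula (high-quality liquid assets divided by net cash outflows over 30 days, with inflows capped at 75% of outflows) to a couple of hypothetical cash flow scenarios; the scenario names and figures are placeholders, not regulatory values.

```python
def liquidity_coverage_ratio(hqla, outflows_30d, inflows_30d):
    """Basel III LCR: HQLA / net cash outflows, inflows capped at 75% of outflows."""
    net_outflows = outflows_30d - min(inflows_30d, 0.75 * outflows_30d)
    return hqla / net_outflows

scenarios = {
    "base":             {"hqla": 120.0, "outflows_30d": 150.0, "inflows_30d": 60.0},
    "rating_downgrade": {"hqla": 120.0, "outflows_30d": 185.0, "inflows_30d": 45.0},
}
for name, s in scenarios.items():
    lcr = liquidity_coverage_ratio(**s)
    print(f"{name}: LCR = {lcr:.0%} ({'ok' if lcr >= 1.0 else 'below 100%'})")
```

In practice, each scenario would be built from granular projected cash flows rather than pre-aggregated totals, which is what preserves the ability to drill back into the drivers of any given ratio.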

Speed and autonomy

To effectively visualize, navigate and report on data, user agility is a prerequisite. Ultimately, speed and autonomy allow business users to better understand the data and focus on the most salient data points. Big data analytics solutions such as Opensee have been designed to deliver exactly this speed and user autonomy. What is new is that these operations can now be performed without compromising on volumes, meaning the full granularity of the dataset can be maintained. Underpinning this breakthrough is the ability to take advantage of horizontal scalability on disk, delivering virtually unlimited capacity on low-cost infrastructure while maintaining RAM-like speeds.

When it comes to optimizing resources, rather than trying to optimize different business data sources in a piecemeal fashion – risking increased utilization of one key resource while trying to reduce another – several datasets can now be centralized into one, or central datasets can be further enriched by adding a secondary set. Centralizing, for example, risk measures alongside information on profit and loss, balance sheet consumption and use of collateral allows business users to aggregate, manipulate, analyze, simulate and visualize business data with a simultaneous and complete view of the impacts across all dimensions. This means the entire “utility function” can be optimized, rather than just one metric at a time. Such an improved data capability represents a revolutionary change that ultimately allows banks to optimize their resources much more efficiently.
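
A hypothetical sketch of this idea is shown below: once the marginal impacts of candidate actions are available across several resource dimensions on one dataset, a single weighted utility function can rank them, rather than optimizing risk-weighted assets, leverage exposure, collateral or P&L in isolation. All action names, weights and figures are illustrative assumptions.

```python
CANDIDATE_ACTIONS = {
    # marginal impacts per dimension (negative = resource released, or P&L given up)
    "novate_to_ccp":    {"rwa": -8.0, "leverage_exposure": -3.0, "collateral":  2.0, "pnl": -0.3},
    "unwind_old_swaps": {"rwa": -5.0, "leverage_exposure": -6.0, "collateral": -1.0, "pnl": -0.8},
    "do_nothing":       {"rwa":  0.0, "leverage_exposure":  0.0, "collateral":  0.0, "pnl":  0.0},
}
RESOURCE_COSTS = {"rwa": 1.0, "leverage_exposure": 0.7, "collateral": 0.5}  # cost per unit consumed

def utility(impacts):
    """P&L gained minus the weighted cost of the extra resources consumed."""
    resource_cost = sum(RESOURCE_COSTS[k] * v for k, v in impacts.items() if k != "pnl")
    return impacts["pnl"] - resource_cost

ranked = sorted(CANDIDATE_ACTIONS, key=lambda name: utility(CANDIDATE_ACTIONS[name]), reverse=True)
for name in ranked:
    print(f"{name}: utility = {utility(CANDIDATE_ACTIONS[name]):+.2f}")
```

In this toy comparison, the second action narrowly wins despite giving up more P&L, because it releases more leverage exposure and collateral overall – precisely the kind of trade-off that is invisible when each metric is optimized separately.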

Adjusting resources with speed and agility

Faced with strict regulatory constraints and a challenging market environment, banks must adapt by leveraging new technologies and solutions to allocate their resources optimally, running multidimensional scenarios on their complete, granular datasets. The era of running optimization scenarios on a manual, intuition-driven basis is coming to an end. Instead, financial institutions that adopt innovative big data solutions are finally able to fine-tune their resources with speed and agility, to their advantage.

