Empirically, the prevailing market prices for liquidity tokens of the constant product market maker (CPMM) -- as offered in practice by companies such as Uniswap -- readily permit arbitrage opportunities by delta hedging the risk of the position. Herein, we investigate this arbitrage opportunity by treating the liquidity token as a derivative position in the prices of the underlying assets for the CPMM. In doing so, not dissimilar to the Black-Scholes result, we deduce risk-neutral pricing and hedging formulas for these liquidity tokens. Furthermore, with our novel pricing formula, we construct a method to calibrate a volatility to data, which provides an updated (non-market) price that would not permit arbitrage if quoted by the CPMM. We conclude with a discussion of novel AMM designs that would bring the pricing of liquidity tokens into the modern financial era.
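The pricing claim can be made concrete under textbook assumptions: a CPMM liquidity position with constant product k is worth V(S) = 2*sqrt(k*S) when the risky asset trades at S, and if S follows a geometric Brownian motion under the risk-neutral measure, the discounted expectation has a closed form. The sketch below is the standard result for this payoff, offered as an illustration of the Black-Scholes-style formula the abstract describes, not necessarily the paper's exact expression; parameter names are ours.

```python
import math

def lp_token_price(S0, k, sigma, r, T):
    """Risk-neutral price of a CPMM liquidity position with terminal value
    2*sqrt(k*S_T), assuming GBM dynamics for S under the pricing measure:
    e^{-rT} E[2 sqrt(k S_T)] = 2 sqrt(k S0) exp(-(r/2 + sigma^2/8) T)."""
    return 2.0 * math.sqrt(k * S0) * math.exp(-(r / 2.0 + sigma**2 / 8.0) * T)

def lp_token_delta(S0, k, sigma, r, T):
    """Delta hedge ratio: derivative of the price above with respect to S0."""
    return math.sqrt(k / S0) * math.exp(-(r / 2.0 + sigma**2 / 8.0) * T)

# Illustrative numbers: pool constant k = 10_000, spot 100, 50% vol, 3% rate, 1y.
print(lp_token_price(100.0, 10_000.0, 0.5, 0.03, 1.0))
print(lp_token_delta(100.0, 10_000.0, 0.5, 0.03, 1.0))
```

The sigma^2/8 term in the exponent is the volatility drag on the square-root payoff; it is what opens the gap between the hedged value and the naive market quote that the abstract identifies as an arbitrage.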
{"title":"DeFi Arbitrage in Hedged Liquidity Tokens","authors":"Maxim Bichuch, Zachary Feinstein","doi":"arxiv-2409.11339","DOIUrl":"https://doi.org/arxiv-2409.11339","url":null,"abstract":"Empirically, the prevailing market prices for liquidity tokens of the\u0000constant product market maker (CPMM) -- as offered in practice by companies\u0000such as Uniswap -- readily permit arbitrage opportunities by delta hedging the\u0000risk of the position. Herein, we investigate this arbitrage opportunity by\u0000treating the liquidity token as a derivative position in the prices of the\u0000underlying assets for the CPMM. In doing so, not dissimilar to the\u0000Black-Scholes result, we deduce risk-neutral pricing and hedging formulas for\u0000these liquidity tokens. Furthermore, with our novel pricing formula, we\u0000construct a method to calibrate a volatility to data which provides an updated\u0000(non-market) price which would not permit arbitrage if quoted by the CPMM. We\u0000conclude with a discussion of novel AMM designs which would bring the pricing\u0000of liquidity tokens into the modern financial era.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"49 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142265655","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Atithi Acharya, Romina Yalovetzky, Pierre Minssen, Shouvanik Chakrabarti, Ruslan Shaydulin, Rudy Raymond, Yue Sun, Dylan Herman, Ruben S. Andrist, Grant Salton, Martin J. A. Schuetz, Helmut G. Katzgraber, Marco Pistoia
Industrially relevant constrained optimization problems, such as portfolio optimization and portfolio rebalancing, are often intractable or difficult to solve exactly. In this work, we propose and benchmark a decomposition pipeline targeting portfolio optimization and rebalancing problems with constraints. The pipeline decomposes the optimization problem into constrained subproblems, which are then solved separately and aggregated to give a final result. Our pipeline includes three main components: preprocessing of correlation matrices based on random matrix theory, modified spectral clustering based on Newman's algorithm, and risk rebalancing. Our empirical results show that our pipeline consistently decomposes real-world portfolio optimization problems into subproblems with a size reduction of approximately 80%. Since subproblems are then solved independently, our pipeline drastically reduces the total computation time for state-of-the-art solvers. Moreover, by decomposing large problems into several smaller subproblems, the pipeline enables the use of near-term quantum devices as solvers, providing a path toward practical utility of quantum computers in portfolio optimization.
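Of the three pipeline components, the random-matrix-theory preprocessing step is the most standardized, so a hedged sketch is possible: clip the eigenvalues of the sample correlation matrix that fall inside the Marchenko-Pastur bulk, which RMT attributes to estimation noise. The thresholding rule below is the common one; the paper's actual preprocessing, clustering, and rebalancing steps may differ in detail.

```python
import numpy as np

def rmt_denoise_correlation(returns: np.ndarray) -> np.ndarray:
    """Denoise a sample correlation matrix via Marchenko-Pastur clipping.
    returns: (T, N) array of asset returns, T observations of N assets."""
    T, N = returns.shape
    corr = np.corrcoef(returns, rowvar=False)
    eigval, eigvec = np.linalg.eigh(corr)
    # Upper edge of the Marchenko-Pastur bulk for a pure-noise matrix.
    q = N / T
    lambda_max = (1.0 + np.sqrt(q)) ** 2
    noise = eigval < lambda_max
    # Replace noise eigenvalues by their mean so the trace is preserved.
    eigval = eigval.copy()
    if noise.any():
        eigval[noise] = eigval[noise].mean()
    denoised = eigvec @ np.diag(eigval) @ eigvec.T
    # Rescale back to a proper correlation matrix with unit diagonal.
    d = np.sqrt(np.diag(denoised))
    return denoised / np.outer(d, d)

rng = np.random.default_rng(0)
C = rmt_denoise_correlation(rng.standard_normal((500, 100)))
```

A cleaned correlation matrix of this kind is also what makes the subsequent spectral-clustering step well posed, since spurious noise eigenvectors no longer dominate the graph structure.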
{"title":"Decomposition Pipeline for Large-Scale Portfolio Optimization with Applications to Near-Term Quantum Computing","authors":"Atithi Acharya, Romina Yalovetzky, Pierre Minssen, Shouvanik Chakrabarti, Ruslan Shaydulin, Rudy Raymond, Yue Sun, Dylan Herman, Ruben S. Andrist, Grant Salton, Martin J. A. Schuetz, Helmut G. Katzgraber, Marco Pistoia","doi":"arxiv-2409.10301","DOIUrl":"https://doi.org/arxiv-2409.10301","url":null,"abstract":"Industrially relevant constrained optimization problems, such as portfolio\u0000optimization and portfolio rebalancing, are often intractable or difficult to\u0000solve exactly. In this work, we propose and benchmark a decomposition pipeline\u0000targeting portfolio optimization and rebalancing problems with constraints. The\u0000pipeline decomposes the optimization problem into constrained subproblems,\u0000which are then solved separately and aggregated to give a final result. Our\u0000pipeline includes three main components: preprocessing of correlation matrices\u0000based on random matrix theory, modified spectral clustering based on Newman's\u0000algorithm, and risk rebalancing. Our empirical results show that our pipeline\u0000consistently decomposes real-world portfolio optimization problems into\u0000subproblems with a size reduction of approximately 80%. Since subproblems are\u0000then solved independently, our pipeline drastically reduces the total\u0000computation time for state-of-the-art solvers. Moreover, by decomposing large\u0000problems into several smaller subproblems, the pipeline enables the use of\u0000near-term quantum devices as solvers, providing a path toward practical utility\u0000of quantum computers in portfolio optimization.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"2 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142265653","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In the United States financial sector, big data technology has become an important means for financial institutions to enhance competitiveness and reduce risk. The core objective of this article is to explore how to fully utilize big data technology to achieve complete integration of the internal and external data of financial institutions, and to create an efficient and reliable platform for big data collection, storage, and analysis. With the continuous expansion and innovation of financial business, traditional risk management models can no longer meet increasingly complex market demands. This article adopts big data mining and real-time streaming data processing technology to monitor and analyze various business data and to issue alerts. Through statistical analysis of historical data and precise mining of customer transaction behavior and relationships, potential risks can be identified more accurately and responded to in a timely manner. This article designs and implements a financial big data intelligent risk control platform. The platform not only achieves effective integration, storage, and analysis of the internal and external data of financial institutions, but also intelligently displays customer characteristics and their relationships, and provides intelligent supervision of various types of risk information.
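The article does not specify its streaming alert logic, so the following is a purely illustrative sketch of a monitor-analyze-alert loop: a rolling z-score rule over a transaction stream. The window size, threshold, and `stream_alerts` helper are all our assumptions, not the platform's design.

```python
from collections import deque
import statistics

def stream_alerts(amounts, window=100, z_threshold=4.0):
    """Yield (index, amount) for transactions whose size deviates sharply
    from the rolling history -- a stand-in for real-time risk monitoring."""
    history = deque(maxlen=window)
    for i, amount in enumerate(amounts):
        if len(history) >= 30:  # warm-up period before scoring
            mu = statistics.fmean(history)
            sd = statistics.pstdev(history) or 1e-9  # guard zero variance
            if abs(amount - mu) / sd > z_threshold:
                yield (i, amount)  # candidate risk alert
        history.append(amount)

# A long run of ordinary amounts followed by one outlier triggers an alert.
suspicious = list(stream_alerts([100, 102, 98, 95, 101] * 20 + [5000]))
```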
{"title":"Research and Design of a Financial Intelligent Risk Control Platform Based on Big Data Analysis and Deep Machine Learning","authors":"Shuochen Bi, Yufan Lian, Ziyue Wang","doi":"arxiv-2409.10331","DOIUrl":"https://doi.org/arxiv-2409.10331","url":null,"abstract":"In the financial field of the United States, the application of big data\u0000technology has become one of the important means for financial institutions to\u0000enhance competitiveness and reduce risks. The core objective of this article is\u0000to explore how to fully utilize big data technology to achieve complete\u0000integration of internal and external data of financial institutions, and create\u0000an efficient and reliable platform for big data collection, storage, and\u0000analysis. With the continuous expansion and innovation of financial business,\u0000traditional risk management models are no longer able to meet the increasingly\u0000complex market demands. This article adopts big data mining and real-time\u0000streaming data processing technology to monitor, analyze, and alert various\u0000business data. Through statistical analysis of historical data and precise\u0000mining of customer transaction behavior and relationships, potential risks can\u0000be more accurately identified and timely responses can be made. This article\u0000designs and implements a financial big data intelligent risk control platform.\u0000This platform not only achieves effective integration, storage, and analysis of\u0000internal and external data of financial institutions, but also intelligently\u0000displays customer characteristics and their related relationships, as well as\u0000intelligent supervision of various risk information","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"97 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142265654","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This paper introduces a novel stochastic model for credit spreads. The stochastic approach leverages the diffusion of default intensities via a CIR++ model and is formulated within a risk-neutral probability space. Our research primarily addresses two gaps in the literature. The first is the lack of credit spread models founded on a stochastic basis that enables continuous modeling, as many existing models rely on factorial assumptions. The second is the limited availability of models that directly yield a term structure of credit spreads. An intermediate result of our model is the provision of a term structure for the prices of defaultable bonds. We present the model alongside an innovative, practical, and conservative calibration approach that minimizes the error between historical and theoretical volatilities of default intensities. We demonstrate the robustness of both the model and its calibration process by comparing its behavior to historical credit spread values. Our findings indicate that the model not only produces realistic credit spread term structure curves but also exhibits consistent diffusion over time. Additionally, the model accurately fits the initial term structure of implied survival probabilities and provides an analytical expression for the credit spread of any given maturity at any future time.
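What makes an analytical credit spread term structure possible here is that CIR intensities admit closed-form survival probabilities via the standard CIR bond-pricing functions. A minimal sketch of that closed form and the implied zero-recovery spread curve, ignoring the deterministic CIR++ shift and recovery (both of which the paper would handle); parameter values are illustrative.

```python
import numpy as np

def cir_survival(lmbda0, kappa, theta, sigma, tau):
    """Survival probability E[exp(-int_0^tau lambda_s ds)] under CIR
    intensity d lambda = kappa (theta - lambda) dt + sigma sqrt(lambda) dW."""
    h = np.sqrt(kappa**2 + 2.0 * sigma**2)
    denom = 2.0 * h + (kappa + h) * (np.exp(h * tau) - 1.0)
    A = (2.0 * h * np.exp((kappa + h) * tau / 2.0) / denom) ** (2.0 * kappa * theta / sigma**2)
    B = 2.0 * (np.exp(h * tau) - 1.0) / denom
    return A * np.exp(-B * lmbda0)

def credit_spread(lmbda0, kappa, theta, sigma, tau):
    """Zero-recovery credit spread implied by the survival curve."""
    return -np.log(cir_survival(lmbda0, kappa, theta, sigma, tau)) / tau

taus = np.linspace(0.25, 10.0, 40)
spreads = credit_spread(0.02, 0.5, 0.03, 0.1, taus)  # term structure, decimal
```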
{"title":"Credit Spreads' Term Structure: Stochastic Modeling with CIR++ Intensity","authors":"Mohamed Ben Alaya, Ahmed Kebaier, Djibril Sarr","doi":"arxiv-2409.09179","DOIUrl":"https://doi.org/arxiv-2409.09179","url":null,"abstract":"This paper introduces a novel stochastic model for credit spreads. The\u0000stochastic approach leverages the diffusion of default intensities via a CIR++\u0000model and is formulated within a risk-neutral probability space. Our research\u0000primarily addresses two gaps in the literature. The first is the lack of credit\u0000spread models founded on a stochastic basis that enables continuous modeling,\u0000as many existing models rely on factorial assumptions. The second is the\u0000limited availability of models that directly yield a term structure of credit\u0000spreads. An intermediate result of our model is the provision of a term\u0000structure for the prices of defaultable bonds. We present the model alongside\u0000an innovative, practical, and conservative calibration approach that minimizes\u0000the error between historical and theoretical volatilities of default\u0000intensities. We demonstrate the robustness of both the model and its\u0000calibration process by comparing its behavior to historical credit spread\u0000values. Our findings indicate that the model not only produces realistic credit\u0000spread term structure curves but also exhibits consistent diffusion over time.\u0000Additionally, the model accurately fits the initial term structure of implied\u0000survival probabilities and provides an analytical expression for the credit\u0000spread of any given maturity at any future time.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"32 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142265656","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Random delays between the occurrence of accident events and the corresponding reporting times of insurance claims are a standard feature of insurance data. The time lag between the reporting and the processing of a claim depends on whether the claim can be processed without delay as it arrives or whether it remains unprocessed for some time because of temporarily insufficient processing capacity that is shared between all incoming claims. We aim to explain and analyze the nature of processing delays and the build-up of backlogs. We show how to select processing capacity optimally in order to minimize claims costs, taking delay-adjusted costs and fixed costs for claims settlement capacity into account. Theoretical results are combined with a large-scale numerical study that demonstrates the practical usefulness of our proposal.
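A minimal sketch of the trade-off the abstract describes: simulate backlog dynamics under a fixed processing capacity, score each capacity level by a per-period cost that combines a delay-adjusted claims cost with a fixed cost per unit of settlement capacity, and pick the minimizer. The dynamics, cost weights, and grid search are our illustrative assumptions, not the paper's model.

```python
import numpy as np

def expected_cost(capacity, arrivals, delay_cost=1.0, capacity_cost=5.0):
    """Backlog recursion backlog_{t+1} = max(0, backlog_t + a_t - c), with
    per-period cost = delay_cost * backlog + capacity_cost * capacity."""
    backlog = 0.0
    total = 0.0
    for a in arrivals:
        backlog = max(0.0, backlog + a - capacity)
        total += delay_cost * backlog + capacity_cost * capacity
    return total / len(arrivals)

rng = np.random.default_rng(1)
arrivals = rng.poisson(10.0, size=10_000)  # claims reported per period
# Grid search for the cost-minimizing settlement capacity.
grid = np.arange(5, 25)
best = min(grid, key=lambda c: expected_cost(c, arrivals))
```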
{"title":"Claims processing and costs under capacity constraints","authors":"Filip Lindskog, Mario V. Wüthrich","doi":"arxiv-2409.09091","DOIUrl":"https://doi.org/arxiv-2409.09091","url":null,"abstract":"Random delays between the occurrence of accident events and the corresponding\u0000reporting times of insurance claims is a standard feature of insurance data.\u0000The time lag between the reporting and the processing of a claim depends on\u0000whether the claim can be processed without delay as it arrives or whether it\u0000remains unprocessed for some time because of temporarily insufficient\u0000processing capacity that is shared between all incoming claims. We aim to\u0000explain and analyze the nature of processing delays and build-up of backlogs.\u0000We show how to select processing capacity optimally in order to minimize claims\u0000costs, taking delay-adjusted costs and fixed costs for claims settlement\u0000capacity into account. Theoretical results are combined with a large-scale\u0000numerical study that demonstrates practical usefulness of our proposal.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"197 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142265442","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
As the body of academic literature continues to grow, researchers face increasing difficulties in effectively searching for relevant resources. Existing databases and search engines often fall short of providing a comprehensive and contextually relevant collection of academic literature. To address this issue, we propose a novel framework that leverages Natural Language Processing (NLP) techniques. This framework automates the retrieval, summarization, and clustering of academic literature within a specific research domain. To demonstrate the effectiveness of our approach, we introduce CyLit, an NLP-powered repository specifically designed for the cyber risk literature. CyLit empowers researchers by providing access to context-specific resources and enabling the tracking of trends in the dynamic and rapidly evolving field of cyber risk. Through the automatic processing of large volumes of data, our NLP-powered solution significantly enhances the efficiency and specificity of academic literature searches. We compare the literature categorization results of CyLit to those presented in survey papers or generated by ChatGPT, highlighting the distinctive insights this tool provides into cyber risk research literature. Using NLP techniques, we aim to revolutionize the way researchers discover, analyze, and utilize academic resources, ultimately fostering advancements in various domains of knowledge.
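As a toy illustration of the clustering stage of such a pipeline (CyLit itself presumably uses far richer NLP for retrieval and summarization; the corpus and cluster count here are made up), TF-IDF vectors plus k-means group abstracts by topical vocabulary:

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Hypothetical mini-corpus of cyber risk abstracts.
abstracts = [
    "systemic cyber risk and insurance pricing",
    "ransomware incident frequency modelling",
    "data breach severity and dependence structures",
    "cyber insurance contract design",
]
vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(abstracts)  # sparse TF-IDF document matrix
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
```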
{"title":"NLP-Powered Repository and Search Engine for Academic Papers: A Case Study on Cyber Risk Literature with CyLit","authors":"Linfeng Zhang, Changyue Hu, Zhiyu Quan","doi":"arxiv-2409.06226","DOIUrl":"https://doi.org/arxiv-2409.06226","url":null,"abstract":"As the body of academic literature continues to grow, researchers face\u0000increasing difficulties in effectively searching for relevant resources.\u0000Existing databases and search engines often fall short of providing a\u0000comprehensive and contextually relevant collection of academic literature. To\u0000address this issue, we propose a novel framework that leverages Natural\u0000Language Processing (NLP) techniques. This framework automates the retrieval,\u0000summarization, and clustering of academic literature within a specific research\u0000domain. To demonstrate the effectiveness of our approach, we introduce CyLit,\u0000an NLP-powered repository specifically designed for the cyber risk literature.\u0000CyLit empowers researchers by providing access to context-specific resources\u0000and enabling the tracking of trends in the dynamic and rapidly evolving field\u0000of cyber risk. Through the automatic processing of large volumes of data, our\u0000NLP-powered solution significantly enhances the efficiency and specificity of\u0000academic literature searches. We compare the literature categorization results\u0000of CyLit to those presented in survey papers or generated by ChatGPT,\u0000highlighting the distinctive insights this tool provides into cyber risk\u0000research literature. Using NLP techniques, we aim to revolutionize the way\u0000researchers discover, analyze, and utilize academic resources, ultimately\u0000fostering advancements in various domains of knowledge.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"2 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142191286","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The abstract theory of risk measures is well-developed for certain classes of solid subspaces of $L^{0}$. We provide an example to illustrate that this framework is insufficient to deal with the subtleties of incomplete markets. To remedy this problem, we consider risk measures on the subspace generated by a closed, absolutely convex, and bounded subset $K \subset L^{0}$, which represents the attainable securities. In this context, we introduce the equicontinuous Fatou property. Under the existence of a certain topology $\tau$ on $\mathrm{span}(K)$, interpreted as a generalized weak-star topology, we obtain an equivalence between the equicontinuous Fatou property and lower semicontinuity with respect to $\tau$. As a corollary, we obtain tractable dual representations for such risk measures, subsuming essentially all known results on weak-star representations of risk measures. This dual representation allows one to prove that all risk measures of this form extend, in a maximal way, to the ideal generated by $\mathrm{span}(K)$ while preserving a Fatou-like property.
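For orientation, the classical solid-space result that this paper generalizes: on $L^{\infty}$, the Fatou property of a convex risk measure is equivalent to weak-star lower semicontinuity, and hence to a dual representation (Föllmer-Schied). A sketch of that textbook statement, not the paper's non-solid version:

```latex
% Background: for a convex risk measure \rho : L^\infty \to \mathbb{R},
% the Fatou property
%   X_n \to X \text{ a.s.},\ \sup_n \|X_n\|_\infty < \infty
%     \implies \rho(X) \le \liminf_n \rho(X_n)
% is equivalent to \sigma(L^\infty, L^1)-lower semicontinuity, and then
\rho(X) = \sup_{Q \ll P} \bigl( \mathbb{E}_Q[-X] - \alpha(Q) \bigr),
\qquad
\alpha(Q) = \sup_{Y \in L^\infty} \bigl( \mathbb{E}_Q[-Y] - \rho(Y) \bigr).
```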
{"title":"Risk measures on incomplete markets: a new non-solid paradigm","authors":"Vasily Melnikov","doi":"arxiv-2409.05194","DOIUrl":"https://doi.org/arxiv-2409.05194","url":null,"abstract":"The abstract theory of risk measures is well-developed for certain classes of\u0000solid subspaces of $L^{0}$. We provide an example to illustrate that this\u0000framework is insufficient to deal with the subtleties of incomplete markets. To\u0000remedy this problem, we consider risk measures on the subspace generated by a\u0000closed, absolutely convex, and bounded subset $Ksubset L^{0}$, which\u0000represents the attainable securities. In this context, we introduce the\u0000equicontinuous Fatou property. Under the existence of a certain topology $tau$\u0000on $mathrm{span}(K)$, interpreted as a generalized weak-star topology, we\u0000obtain an equivalence between the equicontinuous Fatou property, and lower\u0000semicontinuity with respect to $tau$. As a corollary, we obtain tractable dual\u0000representations for such risk measures, which subsumes essentially all known\u0000results on weak-star representations of risk measures. This dual representation\u0000allows one to prove that all risk measures of this form extend, in a maximal\u0000way, to the ideal generated by $mathrm{span}(K)$ while preserving a Fatou-like\u0000property.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"38 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142191287","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We study Pareto optimality in a decentralized peer-to-peer risk-sharing market where agents' preferences are represented by robust distortion risk measures that are not necessarily convex. We obtain a characterization of Pareto-optimal allocations of the aggregate risk in the market, and we show that the shape of the allocations depends primarily on each agent's assessment of the tail of the aggregate risk. We quantify the latter via an index of probabilistic risk aversion, and we illustrate our results using concrete examples of popular families of distortion functions. As an application of our results, we revisit the market for flood risk insurance in the United States. We present the decentralized risk sharing arrangement as an alternative to the current centralized market structure, and we characterize the optimal allocations in a numerical study with historical flood data. We conclude with an in-depth discussion of the advantages and disadvantages of a decentralized insurance scheme in this setting.
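Distortion risk measures have a simple empirical form, which is useful for reproducing numerical studies like the flood-insurance one. A sketch for nonnegative losses, with the Wang transform and the TVaR distortion standing in for the "popular families" the abstract mentions; the sample and parameters are illustrative.

```python
import numpy as np
from scipy.stats import norm

def distortion_risk(losses: np.ndarray, g) -> float:
    """Empirical distortion risk measure for nonnegative losses:
    rho_g(X) = sum_i x_(i) * [g((n-i+1)/n) - g((n-i)/n)],
    with order statistics x_(1) <= ... <= x_(n)."""
    x = np.sort(np.asarray(losses, dtype=float))
    n = len(x)
    tail = np.arange(n, 0, -1) / n          # (n-i+1)/n for i = 1..n
    weights = g(tail) - g(tail - 1.0 / n)   # increments of the distortion
    return float(np.dot(x, weights))

# Wang transform g(u) = Phi(Phi^{-1}(u) + lambda); TVaR distortion g(u) = min(u/alpha, 1).
wang = lambda u, lam=0.5: norm.cdf(norm.ppf(np.clip(u, 0.0, 1.0)) + lam)
tvar = lambda u, alpha=0.05: np.minimum(u / alpha, 1.0)

sample = np.random.default_rng(2).lognormal(mean=0.0, sigma=1.0, size=100_000)
print(distortion_risk(sample, wang), distortion_risk(sample, tvar))
```

Concavity of the distortion function near zero is what governs the tail weighting, which is exactly the feature the paper's index of probabilistic risk aversion quantifies.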
{"title":"Pareto-Optimal Peer-to-Peer Risk Sharing with Robust Distortion Risk Measures","authors":"Mario Ghossoub, Michael B. Zhu, Wing Fung Chong","doi":"arxiv-2409.05103","DOIUrl":"https://doi.org/arxiv-2409.05103","url":null,"abstract":"We study Pareto optimality in a decentralized peer-to-peer risk-sharing\u0000market where agents' preferences are represented by robust distortion risk\u0000measures that are not necessarily convex. We obtain a characterization of\u0000Pareto-optimal allocations of the aggregate risk in the market, and we show\u0000that the shape of the allocations depends primarily on each agent's assessment\u0000of the tail of the aggregate risk. We quantify the latter via an index of\u0000probabilistic risk aversion, and we illustrate our results using concrete\u0000examples of popular families of distortion functions. As an application of our\u0000results, we revisit the market for flood risk insurance in the United States.\u0000We present the decentralized risk sharing arrangement as an alternative to the\u0000current centralized market structure, and we characterize the optimal\u0000allocations in a numerical study with historical flood data. We conclude with\u0000an in-depth discussion of the advantages and disadvantages of a decentralized\u0000insurance scheme in this setting.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"395 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142191293","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
This technical report presents a stochastic framework for pricing temperature derivatives in Indian markets, accounting for both monsoon and winter seasons. Utilising historical temperature and electricity consumption data from 12 Indian states, we develop a model based on a modified mean-reverting Ornstein-Uhlenbeck process and employ Monte Carlo simulations for pricing. Our analysis reveals significant variations in option pricing across states, with monsoon call options ranging from 10.78 USD to 182.82 USD and winter put options from 48.65 USD to 194.99 USD. The introduction of a risk aversion parameter has a substantial impact on pricing, leading to increases of up to 416% in option prices for certain states. Sensitivity analyses indicate that option prices are more responsive to changes in volatility than to mean reversion rates. Additionally, extreme weather scenarios can raise option prices by up to 409% during heatwaves and lower them by up to 60% during cold waves. These findings emphasise the importance of state-specific and season-specific approaches to temperature derivative pricing, highlighting the need for tailored risk management strategies in India's diverse climate.
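A stripped-down version of the pricing approach the report describes: simulate a mean-reverting Ornstein-Uhlenbeck temperature process around a seasonal mean and Monte Carlo price a call on cumulative cooling degree days. The seasonal function, degree-day baseline, and parameters below are placeholders, not the calibrated state-specific values from the report.

```python
import numpy as np

def price_seasonal_call(T0, kappa, sigma, r, strike, tick, days=122,
                        n_paths=100_000, seed=3):
    """Monte Carlo price of a call on cumulative cooling degree days (CDD)
    over a season, with temperature following an OU process that reverts to
    a sinusoidal seasonal mean (daily time step)."""
    rng = np.random.default_rng(seed)
    dt = 1.0
    seasonal = lambda t: 30.0 + 3.0 * np.sin(2.0 * np.pi * t / 365.0)  # deg C
    temp = np.full(n_paths, T0)
    cdd = np.zeros(n_paths)
    for t in range(days):
        temp += kappa * (seasonal(t) - temp) * dt \
                + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
        cdd += np.maximum(temp - 24.0, 0.0)  # CDD against a 24 C baseline
    payoff = tick * np.maximum(cdd - strike, 0.0)
    return np.exp(-r * days / 365.0) * payoff.mean()

print(price_seasonal_call(T0=30.0, kappa=0.3, sigma=2.0, r=0.07,
                          strike=700.0, tick=1.0))
```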
{"title":"Quantifying Seasonal Weather Risk in Indian Markets: Stochastic Model for Risk-Averse State-Specific Temperature Derivative Pricing","authors":"Soumil Hooda, Shubham Sharma, Kunal Bansal","doi":"arxiv-2409.04541","DOIUrl":"https://doi.org/arxiv-2409.04541","url":null,"abstract":"This technical report presents a stochastic framework for pricing temperature\u0000derivatives in Indian markets accounting for both monsoon and winter seasons.\u0000Utilising historical temperature and electricity consumption data from 12\u0000Indian states we develop a model based on a modified mean-reverting\u0000Ornstein-Uhlenbeck process and employ Monte Carlo simulations for pricing. Our\u0000analysis reveals significant variations in option pricing across states with\u0000monsoon call options ranging from 10.78 USD to 182.82 USD and winter put\u0000options from 48.65 USD to 194.99 USD. The introduction of a risk aversion\u0000parameter shows substantial impacts on pricing leading to an increase of up to\u0000416 percentage in option prices for certain states. Sensitivity analyses\u0000indicate that option prices are more responsive to changes in volatility than\u0000to mean reversion rates. Additionally extreme weather scenarios can shift\u0000option prices by up to 409 percentage during heatwaves and decrease by 60\u0000percentage during cold waves. These findings emphasise the importance of\u0000state-specific and season-specific approaches in temperature derivative pricing\u0000highlighting the need for tailored risk management strategies in India's\u0000diverse climate.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"64 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142191289","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Lukasz Szpruch, Marc Sabaté Vidales, Tanut Treetanthiploet, Yufei Zhang
We study the loan contracts offered by decentralised loan protocols (DLPs) through the lens of financial derivatives. DLPs, which effectively are clearinghouses, facilitate transactions between option buyers (i.e. borrowers) and option sellers (i.e. lenders). The loan-to-value at which the contract is initiated determines the option premium borrowers pay for entering the contract, and this can be deduced from no-arbitrage pricing theory. We show that when there are no market frictions and no spread between lending and borrowing rates, it is optimal never to enter the lending contract. Next, by accounting for the spread between rates and for transaction costs, we develop a deep neural network-based algorithm for learning trading strategies on external markets that replicate the payoff of lending contracts that are not necessarily optimally exercised. This makes it possible to hedge the risk lenders carry by issuing options sold to the borrowers, which can complement (or even replace) the liquidation mechanism used to protect lenders' capital. Our approach can also be used to exploit (statistical) arbitrage opportunities that may arise when a DLP allows users to enter lending contracts at a loan-to-value that is not appropriately calibrated to market conditions, and/or when different markets price risk differently. We validate our approach with thorough simulation experiments using historical data. A sketch of the embedded option follows.
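One way to make the option analogy concrete: under the abstract's frictionless, no-spread assumptions, the lender is short a put on the collateral struck at the debt, so a fair premium follows from Black-Scholes with the loan-to-value fixing the moneyness. This is a sketch under a GBM collateral assumption; the paper's deep-hedging algorithm is aimed precisely at the frictional case this formula ignores.

```python
import math
from statistics import NormalDist

N = NormalDist().cdf  # standard normal CDF

def lender_option_value(collateral, ltv, sigma, r, T):
    """Black-Scholes value of the lender's embedded short position: a put on
    the collateral struck at the debt D = ltv * collateral. Under no-friction,
    no-spread assumptions this premium is what a fairly priced contract
    should charge the borrower."""
    D = ltv * collateral
    d1 = (math.log(collateral / D) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return D * math.exp(-r * T) * N(-d2) - collateral * N(-d1)

# Illustrative: 80% loan-to-value, 60% collateral volatility, 1-year horizon.
print(lender_option_value(collateral=1.0, ltv=0.8, sigma=0.6, r=0.02, T=1.0))
```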
{"title":"Pricing and hedging of decentralised lending contracts","authors":"Lukasz Szpruch, Marc Sabaté Vidales, Tanut Treetanthiploet, Yufei Zhang","doi":"arxiv-2409.04233","DOIUrl":"https://doi.org/arxiv-2409.04233","url":null,"abstract":"We study the loan contracts offered by decentralised loan protocols (DLPs)\u0000through the lens of financial derivatives. DLPs, which effectively are\u0000clearinghouses, facilitate transactions between option buyers (i.e. borrowers)\u0000and option sellers (i.e. lenders). The loan-to-value at which the contract is\u0000initiated determines the option premium borrowers pay for entering the\u0000contract, and this can be deduced from the non-arbitrage pricing theory. We\u0000show that when there are no market frictions, and there is no spread between\u0000lending and borrowing rates, it is optimal to never enter the lending contract. Next, by accounting for the spread between rates and transactional costs, we\u0000develop a deep neural network-based algorithm for learning trading strategies\u0000on the external markets that allow us to replicate the payoff of the lending\u0000contracts that are not necessarily optimally exercised. This allows hedge the\u0000risk lenders carry by issuing options sold to the borrowers, which can\u0000complement (or even replace) the liquidations mechanism used to protect\u0000lenders' capital. Our approach can also be used to exploit (statistical)\u0000arbitrage opportunities that may arise when DLP allow users to enter lending\u0000contracts with loan-to-value, which is not appropriately calibrated to market\u0000conditions or/and when different markets price risk differently. We present\u0000thorough simulation experiments using historical data and simulations to\u0000validate our approach.","PeriodicalId":501128,"journal":{"name":"arXiv - QuantFin - Risk Management","volume":"75 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2024-09-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142191288","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}