How predictable are personal income tax rates in the U.S., and does household spending respond to news about future taxes even before the rates change? To answer these questions, this paper uses novel historical high-frequency data on tax-exempt municipal bonds and develops a model of the term structure of municipal yield spreads over taxable bonds as a function of future top income tax rates and a risk premium. Testing the model with the presidential elections of 1980, 1992, and 2000 shows that financial markets forecast future tax reforms remarkably well in both the short and the long run. Combining these market-based tax expectations, or "tax news shocks," with data from the Consumer Expenditure Survey shows strong evidence of anticipation effects among higher-income consumers well before tax rates change. Consumer spending changes about one-for-one with changes in expected lifetime tax liabilities. These findings imply that ignoring anticipation effects can substantially bias estimates of the total effect of a tax change.
{"title":"Tax News Shocks and Consumption","authors":"Lorenz Kueng","doi":"10.2139/ssrn.2746486","DOIUrl":"https://doi.org/10.2139/ssrn.2746486","url":null,"abstract":"How predictable are personal income tax rates in the U.S., and does household spending respond to news about future taxes even before the rates change? To answer these questions, this paper uses novel historical high-frequency data of tax-exempt municipal bonds and develops a model of the term structure of municipal yield spreads to taxable bonds as a function of future top income tax rates and a risk premium. Testing the model using the presidential elections of 1980, 1992 and 2000 shows that financial markets forecast future tax reforms remarkably well in both the short and long run. Combining these market-based tax expectations or \"tax news shocks'' with data from the Consumer Expenditure Survey shows strong evidence of anticipation effects to future tax changes among higher-income consumers, well before the tax rates change. Consumer spending changes about one-for-one with changes in expected lifetime tax liabilities. These findings imply that ignoring anticipation effects can substantially bias estimates of the total effect of a tax change.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"113953909","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Credit risk measurement has significantly affected the lending performance of commercial banks, not only in Kenya but across East Africa, and has contributed to financial crises and poor lending performance. The banking industry has suffered dramatic losses, with banks suddenly announcing large losses from credit exposures that turned sour. The general objective of this study was to evaluate the influence of credit risk measurement on the lending performance of commercial banks in Nairobi County, Kenya. The study used a descriptive survey research design, and the target population was drawn at two levels: employees of the 42 commercial banks operating in Kenya, with a sample consisting of credit managers and other bankers. Purposive sampling was used to pick 42 credit managers, and simple random sampling was used to select the other 301 respondents. Data were collected using questionnaires and analyzed using descriptive statistics and binary logistic regression. The results revealed that credit risk measurement positively influenced bank lending performance. The study concluded that credit risk measurement activities significantly influence the lending performance of commercial banks and that, as a result, the operating capital of commercial banks has declined. The study recommended that the Kenyan government, through the National Treasury and in collaboration with the CBK and KBA, develop policies to help commercial banks optimize credit risk measurement and improve lending performance, which is currently affected to a great extent.
{"title":"Influence of Credit Risk Measurement on Lending Performance of Commercial Banks in Nairobi County, Kenya","authors":"John Karanja, J. Bichanga, G. Kingoriah","doi":"10.2139/ssrn.3234291","DOIUrl":"https://doi.org/10.2139/ssrn.3234291","url":null,"abstract":"The credit risk measurement has affected significantly the lending performance of the commercial banks not only in Kenya but also in east Africa and has led to financial crises and poor lending performance. There has been a dramatic loss in the banking industry and suddenly announced large losses due to credit exposures that turned sour. The general objective of this study was to evaluate the influence of credit risk measurement on the lending performance of commercial banks in Nairobi County, Kenya. This study used descriptive survey research design and the target population for this study was at two levels. The target population was employees of the 42 commercial banks in operation in Kenya and the sample consisted of credit managers and other bankers where purposive sampling was used to pick 42 credit managers and simple random sampling was used to select the other 301 respondents. Data was collected using questionnaires and analyzed using descriptive statistics and logistic regression analysis (binary) was used. The results of the study revealed that credit risks measurements influenced bank lending performance positively. The study concluded that credit risk measurement activities significantly influence the lending performance of commercial banks and as a result the operating capital of commercial banks have gone down. The study recommended that Kenya government through the National Treasury and in collaboration with CBK and KBA should develop policies that will help the commercial banks optimize of credit risks measurement and improve the lending performance which is currently affected to great extent.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130397464","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
How do market prices adjust towards stability after a shock? Tracking individual stock prices following their dramatic shakeup after Donald Trump’s surprise election provides an answer. Prices moved overwhelmingly in the appropriate direction on the first post-election day, albeit much too little. Relative prices needed several daily iterations to converge. Three days of historically strong cross-sectional momentum were followed by a brief reversal. Prices then settled. Firm characteristics that explained first-day returns, such as corporate taxes and foreign revenues, accounted for most of the observed momentum. These findings support prominent theories of slow but predictable diffusion of information into prices.
{"title":"Paths to Convergence: Stock Price Adjustment After the Trump Election Shock","authors":"A. Wagner, R. Zeckhauser, Alexandre Ziegler","doi":"10.2139/ssrn.3037023","DOIUrl":"https://doi.org/10.2139/ssrn.3037023","url":null,"abstract":"How do market prices adjust towards stability after a shock? Tracking individual stock prices following their dramatic shakeup after Donald Trump’s surprise election provides an answer. Prices moved overwhelmingly in the appropriate direction on the first post-election day, albeit much too little. Relative prices needed several daily iterations to converge. Three days of historically strong cross-sectional momentum were followed by a brief reversal. Prices then settled. Firm characteristics that explained first-day returns, such as corporate taxes and foreign revenues, accounted for most of the observed momentum. These findings support prominent theories of slow but predictable diffusion of information into prices.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129139341","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The assessment of commercial lease arrangements has developed much faster theoretically than empirically. This disparity stems largely from rapid theoretical development, whereas the evidence base remains thin because large databases of commercial leasing arrangements are difficult to access. As a result, empirical studies on the subject are limited. Moreover, the preliminary studies that do exist tend to report descriptive features of their samples rather than assess their reliability with respect to data and theory.
{"title":"An Assessment of the Elements of Yields and or Returns on Finance Lease Arrangements","authors":"Frederick Anning","doi":"10.2139/ssrn.3205819","DOIUrl":"https://doi.org/10.2139/ssrn.3205819","url":null,"abstract":"Assessment of commercial lease arrangements has developed theoretically than empirically at a quite faster pace. The main disparity regarding the pace it has moved so far has been by virtue of the theoretical development whereas the evidence base is hardly explained due to inaccessible large sized database composed of commercial leasing arrangements. Because of this there has been limited empirical studies on the subject matter. Further preliminary studies have more sophisticated and or advanced insight which thus report descriptive features of the samples as against an assessment of their reliability in respect of data and theory.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125044578","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using new transaction data, I find considerable deviations from consumption smoothing in response to large, regular, predetermined, and salient payments from the Alaska Permanent Fund. On average, the marginal propensity to consume (MPC) is 25% for nondurables and services within one quarter of the payments. The MPC is heterogeneous, monotonically increasing with income, and the average is largely driven by high-income households with substantial amounts of liquid assets, who have MPCs above 50%. The account-level data and the properties of the payments rule out most previous explanations of excess sensitivity, including buffer stock models and rational inattention. How big are these "mistakes"? Using a sufficient statistics approach, I show that the welfare loss from excess sensitivity depends on the MPC and the relative payment size as a fraction of income. Since the lump-sum payments do not depend on income, the two statistics are negatively correlated such that the welfare losses are similar across households and small (less than 0.1% of wealth), despite the large MPCs.
{"title":"Excess Sensitivity of High-Income Consumers","authors":"Lorenz Kueng","doi":"10.2139/ssrn.2627893","DOIUrl":"https://doi.org/10.2139/ssrn.2627893","url":null,"abstract":"Using new transaction data, I find considerable deviations from consumption smoothing in response to large, regular, predetermined, and salient payments from the Alaska Permanent Fund. On average, the marginal propensity to consume (MPC) is 25% for nondurables and services within one quarter of the payments. The MPC is heterogeneous, monotonically increasing with income, and the average is largely driven by high-income households with substantial amounts of liquid assets, who have MPCs above 50%. The account-level data and the properties of the payments rule out most previous explanations of excess sensitivity, including buffer stock models and rational inattention. How big are these \"mistakes\"? Using a sufficient statistics approach, I show that the welfare loss from excess sensitivity depends on the MPC and the relative payment size as a fraction of income. Since the lump-sum payments do not depend on income, the two statistics are negatively correlated such that the welfare losses are similar across households and small (less than 0.1% of wealth), despite the large MPCs.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"61 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-05-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126289499","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Using high-frequency data, we document that episodes of market turmoil in the European sovereign bond market are on average associated with large decreases in trading volume. The response of trading volume to market stress is conditional on transaction costs: low transaction cost turmoil episodes are associated with volume increases (investors rebalance), while high transaction cost turmoil periods are associated with abnormally low volume (market freezes). We find suggestive evidence of market freezes in response to shocks to the risk-bearing capacity of market makers, while investor rebalancing is triggered by wealth shocks. Overall, our results show that the recent sovereign debt crisis was not associated with large-scale investor rebalancing.
{"title":"The Sovereign Debt Crisis: Rebalancing or Freezes?","authors":"P. Östberg, T. Richter","doi":"10.2139/ssrn.3060504","DOIUrl":"https://doi.org/10.2139/ssrn.3060504","url":null,"abstract":"Using high-frequency data we document that episodes of market turmoil in the European sovereign bond market are on average associated with large decreases in trading volume. The response of trading volume to market stress is conditional on transaction costs. Low transaction cost turmoil episodes are associated with volume increases (investors rebalance), while high transaction cost turmoil periods are associated with abnormally low volume (market freezes). We find suggestive evidence of market freezes in response to shocks to the risk bearing capacity of market makers while investor rebalancing is triggered by wealth shocks. Overall, our results show that the recent sovereign debt crisis was not associated with large-scale investor rebalancing.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132682887","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Data transformations are commonly used throughout statistics to convert data into distributions with properties that make them easier to work with. In time series, stationarity is one of the most commonly violated assumptions because the mean and variance are time dependent. Dickey and Fuller (1979) showed that differencing can render such data stationary. It is also common to try to make data stationary by taking the natural log or by using growth rates instead of the original non-stationary data. There is concern that transforming the data through differencing loses valuable information. This paper proposes a method for measuring the data lost under these three types of transformations.
{"title":"Calculating Data Loss for Time-Series Data","authors":"Dimitri Bianco","doi":"10.2139/ssrn.3230502","DOIUrl":"https://doi.org/10.2139/ssrn.3230502","url":null,"abstract":"Data transformations are commonly used across statistics to transform data distributions into distributions with properties that make them more user friendly. In time-series, stationarity is one of the most common assumptions that is violated because the mean and variance are time dependent. Dick and Fuller (1979) have proven that differencing data can make data stationary. It is also common to try to make data stationary through taking the natural log or using the growth rates of the data instead of the original non-stationary data. There is concern that transforming the data through differencing loses valuable information. This paper purposes a method for measuring data lost from these three types of transformations.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115265096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Most assets are traded in multiple interconnected trading venues. This paper develops an equilibrium model of decentralized markets that accommodates general market structures with coexisting exchanges. Decentralized markets can allocate risk among traders with different risk preferences more efficiently, thus realizing gains from trade that cannot be reproduced in centralized markets. Market decentralization always increases price impact. Yet, markets in which assets are traded in multiple exchanges, whether they are disjoint or intermediated, can give higher welfare than the centralized market with the same traders and assets. In decentralized markets, demand substitutability across assets is endogenous and heterogeneous among traders.
{"title":"Decentralized Exchange","authors":"S. Malamud, M. Rostek","doi":"10.2139/ssrn.3146828","DOIUrl":"https://doi.org/10.2139/ssrn.3146828","url":null,"abstract":"Most assets are traded in multiple interconnected trading venues. This paper develops an equilibrium model of decentralized markets that accommodates general market structures with coexisting exchanges. Decentralized markets can allocate risk among traders with different risk preferences more efficiently, thus realizing gains from trade that cannot be reproduced in centralized markets. Market decentralization always increases price impact. Yet, markets in which assets are traded in multiple exchanges, whether they are disjoint or intermediated, can give higher welfare than the centralized market with the same traders and assets. In decentralized markets, demand substitutability across assets is endogenous and heterogeneous among traders.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126856230","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
In general multi-asset models of financial markets, the classic no-arbitrage concepts NFLVR and NUPBR have a serious shortcoming — they depend crucially on the way prices are discounted. To avoid this unnatural economic behaviour, we introduce a new idea for defining “absence of arbitrage”. It rests on the new notion of strongly index weight maximal strategies, which allows us to generalise both NFLVR (by dynamic index weight efficiency) and NUPBR (by dynamic index weight viability). These new no-arbitrage concepts do not change when we look at discounted or undiscounted prices, and they can be used in open-ended models under very weak assumptions on asset prices. We establish corresponding versions of the FTAP, i.e., dual characterisations of our concepts in terms of martingale properties. A key new feature is that as one expects, “properly anticipated prices fluctuate randomly”, but with an endogenous discounting process which is not a priori chosen exogenously. We also illustrate our results by a wide range of examples. In particular, we show that the classic Black–Scholes model on [0,1) is arbitrage-free in our sense if and only if its parameters satisfy m−r ∈ {0, σ²} or, equivalently, either bond-discounted or stock-discounted prices are martingales.
{"title":"Making no-arbitrage discounting-invariant: a new FTAP beyond NFLVR and NUPBR","authors":"D. Bálint, M. Schweizer","doi":"10.2139/ssrn.3676499","DOIUrl":"https://doi.org/10.2139/ssrn.3676499","url":null,"abstract":"In general multi-asset models of financial markets, the classic no-arbitrage concepts NFLVR and NUPBR have a serious shortcoming — they depend crucially on the way prices are discounted. To avoid this unnatural economic behaviour, we introduce a new idea for defining “absence of arbitrage”. It rests on the new notion of strongly index weight maximal strategies, which allows us to generalise both NFLVR (by dynamic index weight efficiency) and NUPBR (by dynamic index weight viability). These new no-arbitrage concepts do not change when we look at discounted or undiscounted prices, and they can be used in open-ended models under very weak assumptions on asset prices. We establish corresponding versions of the FTAP, i.e., dual characterisations of our concepts in terms of martingale properties. A key new feature is that as one expects, “properly anticipated prices fluctuate randomly”, but with an endogenous discounting process which is not a priori chosen exogenously. We also illustrate our results by a wide range of examples. In particular, we show that the classic Black–Scholes model on [0,1) is arbitrage-free in our sense if and only if its parameters satisfy m−r e {0, σ²} or, equivalently, either bond-discounted or stock-discounted prices are martingales.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132418029","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
We introduce a class of quantile-based risk measures that generalize Value at Risk (VaR) and, like Expected Shortfall (ES), take into account both the frequency and the severity of losses. Under VaR, a single confidence level is assigned regardless of the size of potential losses. We instead allow for a range of confidence levels that depend on the loss magnitude. The key ingredient is a benchmark loss distribution (BLD), i.e., a function that associates to each potential loss a maximal acceptable probability of occurrence. The corresponding risk measure, called Loss VaR (LVaR), determines the minimal capital injection that is required to align the loss distribution of a risky position to the target BLD. By design, one has full flexibility in the choice of the BLD profile and, therefore, in the range of relevant quantiles. Special attention is given to piecewise constant functions and to tail distributions of benchmark random losses, in which case the acceptability condition imposed by the BLD boils down to first-order stochastic dominance. We provide a comprehensive study of the main finance-theoretical and statistical properties of LVaR, with a focus on their comparison with VaR and ES. Merits and drawbacks are discussed, and applications to capital adequacy, portfolio risk management, and catastrophic risk are presented.
{"title":"Risk Measures Based on Benchmark Loss Distributions","authors":"V. Bignozzi, Matteo Burzoni, Cosimo Munari","doi":"10.2139/ssrn.3088423","DOIUrl":"https://doi.org/10.2139/ssrn.3088423","url":null,"abstract":"We introduce a class of quantile-based risk measures that generalize Value at Risk (VaR) and, likewise Expected Shortfall (ES), take into account both the frequency and the severity of losses. Under VaR a single confidence level is assigned regardless of the size of potential losses. We allow for a range of confidence levels that depend on the loss magnitude. The key ingredient is a benchmark loss distribution (BLD), i.e.~a function that associates to each potential loss a maximal acceptable probability of occurrence. The corresponding risk measure, called Loss VaR (LVaR), determines the minimal capital injection that is required to align the loss distribution of a risky position to the target BLD. By design, one has full flexibility in the choice of the BLD profile and, therefore, in the range of relevant quantiles. Special attention is given to piecewise constant functions and to tail distributions of benchmark random losses, in which case the acceptability condition imposed by the BLD boils down to first-order stochastic dominance. We provide a comprehensive study of the main finance theoretical and statistical properties of LVaR with a focus on their comparison with VaR and ES. Merits and drawbacks are discussed and applications to capital adequacy, portfolio risk management and catastrophic risk are presented.","PeriodicalId":269529,"journal":{"name":"Swiss Finance Institute Research Paper Series","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-03-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116246919","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}