|Mario Wuthrich||Title: LocalGLMnet: an interpretable deep learning architecture|
Abstract: We present a new deep learning model called the LocalGLMnet. While deep learning models lead to very competitive regression models, often outperforming classical statistical models such as generalized linear models, they have the disadvantage that deep learning solutions are difficult to interpret and explain, and variable selection is not easily possible. Inspired by the appealing structure of generalized linear models, we propose a new network architecture that shares similar features with generalized linear models but provides superior predictive power benefiting from the art of representation learning. This new architecture allows for variable selection of tabular data and for interpretation of the calibrated deep learning model.
Reference: SSRN Preprint
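The GLM-like skip-connection structure described in the abstract can be sketched as follows. This is a minimal numpy stand-in, not the paper's calibration: the layer sizes, the log link, and all variable names are illustrative assumptions. The key idea is that a feed-forward network outputs feature-specific regression attentions beta(x), which enter a GLM-style linear predictor.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def localglmnet_forward(x, W1, b1, W2, b2, beta0):
    """One LocalGLMnet-style forward pass: a small feed-forward
    network maps the features x to regression attentions beta(x)
    of the same dimension as x, and the linear predictor is the
    GLM-like skip connection beta0 + <beta(x), x>."""
    h = relu(x @ W1 + b1)                   # hidden representation
    beta = h @ W2 + b2                      # beta(x), one weight per feature
    eta = beta0 + np.sum(beta * x, axis=1)  # GLM-style linear predictor
    mu = np.exp(eta)                        # log link (illustrative choice)
    return mu, beta

# untrained toy example: 5 observations with 3 tabular features
x = rng.normal(size=(5, 3))
W1 = 0.3 * rng.normal(size=(3, 8)); b1 = np.zeros(8)
W2 = 0.3 * rng.normal(size=(8, 3)); b2 = np.zeros(3)

mu, beta = localglmnet_forward(x, W1, b1, W2, b2, beta0=0.1)
```

Interpretation and variable selection then come from inspecting the fitted attentions: a feature j whose beta_j(x) stays close to zero across observations contributes little to the predictor and is a candidate for removal.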
|Marie-Pier Côté||Title: Micro-level reserving for general insurance claims using an LSTM network|
Abstract: Detailed information about individual claims is completely ignored when insurance claims data are aggregated in development triangles for loss reserving. In the hope of extracting predictive power from the individual claim characteristics, researchers recently proposed to move away from these macro-level methods in favor of micro-level loss reserving approaches. We introduce a discrete-time individual reserving framework incorporating granular information in a deep learning approach based on a Long Short-Term Memory (LSTM) neural network. At each time period, the network has two tasks: first, a classification of whether there is a non-zero payment, and second, a prediction of the corresponding non-zero amount. A generalized Pareto model for excess payments over a threshold allows us to adjust the LSTM reserve prediction to account for extreme payments. We illustrate the method and results on a simulated and a real P&C insurance dataset.
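One way the two network heads might combine with the generalized Pareto tail model is sketched below. This is a hedged illustration: the exact blending rule, the function names, and the parameters are assumptions, not the paper's adjustment.

```python
def gpd_mean_excess(sigma, xi):
    """Mean excess of a generalized Pareto distribution with scale
    sigma and shape xi; finite only for xi < 1."""
    assert xi < 1.0
    return sigma / (1.0 - xi)

def expected_payment(p_nonzero, amount_below, p_exceed, threshold, sigma, xi):
    """Expected payment for one period, combining
      - p_nonzero:     classification head, P(payment > 0)
      - amount_below:  regression head, mean payment below the threshold
      - p_exceed:      P(payment > threshold | payment > 0)
      - threshold, sigma, xi: GPD model for the excess over the threshold
    This blending rule is an illustrative assumption."""
    tail_mean = threshold + gpd_mean_excess(sigma, xi)
    severity = (1.0 - p_exceed) * amount_below + p_exceed * tail_mean
    return p_nonzero * severity

# with no chance of exceeding the threshold, the tail adjustment vanishes
print(expected_payment(0.5, 1000.0, 0.0, 5000.0, 2000.0, 0.3))  # -> 500.0
```

The point of the adjustment is visible in the tail term: a heavy tail (xi close to 1) inflates the mean excess and hence the reserve, which a squared-error-trained network head would tend to understate.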
|Brian Fannin||Title: Tomorrow's Actuary: The Changing Face of Risk Engineering|
Abstract: Technology has changed the character of insurable risk, but also the way that actuaries approach the analysis of such risk. There is an ongoing shift in the tools that actuaries use in their work. In order to better understand this movement, in the summer of 2021, the CAS sent a technology survey to its members and candidates. In addition, the CAS has broadened its support for research which examines operational and algorithmic bias, with specific attention to the use of race and other protected classes in insurance rating. This talk will examine those items as well as several others to try to get a sense of how actuaries are adapting to meet the demands of tomorrow.
|Andrew Mackenzie||Title: Payment Accuracy in Value Based Care|
Abstract: Value based care (VBC) arrangements and alternative payment models are growing quickly in the US healthcare system. The idea is to share more holistic financial risk with entities that have more control over member spending and utilization instead of simply paying fixed fees per service rendered, and to reward entities that increase care efficiencies in payment and quality. Successful VBC arrangements should result in better patient care and less expensive care. In order to appropriately recognize and quantify value created by the VBC participants, there needs to be a way to measure what costs and utilization would have been if the VBC program were not in place (in other words, there needs to be an ability to quantify an appropriate baseline of care). Due to the stochastic nature of claims, the presence of medical trend, and the lack of a pure randomized control environment, there usually exist material levels of measurement risk in calculating any VBC performance payment. Techniques exist to reduce the size of measurement error, but VBC arrangements are often designed without a complete understanding of the likely size of measurement risk and how that can interfere with reported outcomes and ultimately performance payments. Failure to appropriately understand or mitigate measurement risk can result in substantial financial losses to participants in VBC programs. Our work specializes in quantifying such measurement risk and determining appropriate levels of mitigation considering the costs and benefits of doing so.
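A toy simulation can make the measurement-risk point concrete. All numbers below are hypothetical (heavy-tailed member costs, a program that truly saves 3%, a baseline projected from a separate pre-period population with an assumed trend); the sketch is not the speaker's methodology.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: lognormal per-member annual cost, a true 3%
# savings rate, and a counterfactual baseline that must be projected
# from a finite pre-period population with an assumed trend factor.
n_members = 2_000
true_savings = 0.03
trend = 1.05          # true medical trend
assumed_trend = 1.05  # trend assumed when projecting the baseline

def simulate_measured_savings(n_sims=1_000):
    measured = []
    for _ in range(n_sims):
        pre = rng.lognormal(mean=8.0, sigma=1.5, size=n_members)
        post = (rng.lognormal(mean=8.0, sigma=1.5, size=n_members)
                * trend * (1 - true_savings))
        projected_baseline = pre.mean() * assumed_trend
        measured.append(1.0 - post.mean() / projected_baseline)
    return np.array(measured)

s = simulate_measured_savings()
```

In this toy setup the standard deviation of the measured savings rate is roughly three times the true 3% effect, so a single year's measurement can easily report a loss for a program that genuinely saves money, which is exactly the risk the abstract warns about.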
|Josh Otis||Title: Telematics at Nationwide Insurance|
Abstract: The rise of smartphone ownership and usage has accelerated over the past decade and profoundly changed society. While much of the change has been positive, change can often outpace society’s ability to adapt appropriately. The fact of the matter is that distracted driving, and specifically phone use behind the wheel, is a huge societal problem, one that will continue to get worse if action is not taken. Predictions from our telematics provider are that by the year 2025, we'll have 4,000 fatalities and over 500,000 crashes annually attributed directly to phone-related distractions. Fortunately, the root of the problem will also be part of the solution. Telematics data obtained from phone sensors is helping us to understand risk, price insurance accordingly and work to change driving behavior to make our roads safer.
|Peng Shi||Title: Enhancing Claims Triage with Dynamic Data|
Abstract: In property insurance claims triage, insurers often use static information to assess the severity of a claim and identify the subsequent actions. We hypothesize that the pattern of weather conditions throughout the course of the loss event is predictive of insured losses and hence that appropriate use of weather dynamics improves the operation of the insurer's claims management. To test this hypothesis, we propose a deep learning method to incorporate dynamic weather data in the predictive modeling of insured losses for reported claims. In the empirical analysis, we examine a portfolio of hail damage property insurance claims obtained from a major U.S. insurance carrier. We show that leveraging weather dynamics in claims triage leads to a substantial reduction in operational cost.
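A minimal sketch of how a recurrent encoding of event-level weather dynamics could feed a severity prediction is given below. A plain tanh recurrence stands in for whatever recurrent architecture the paper uses; the covariates, dimensions, log link, and all names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def encode_weather(seq, Wx, Wh, b):
    """Fold a T x d sequence of weather covariates (e.g. hourly wind,
    precipitation, and hail-size proxies over the loss event) into a
    fixed-length state with a simple tanh recurrence."""
    h = np.zeros(Wh.shape[0])
    for x_t in seq:
        h = np.tanh(Wx @ x_t + Wh @ h + b)
    return h

def predict_severity(seq, static, params):
    """Combine the dynamic weather encoding with static claim
    features into a positive expected loss via a log link."""
    Wx, Wh, b, w_h, w_s, b_out = params
    h = encode_weather(seq, Wx, Wh, b)
    eta = w_h @ h + w_s @ static + b_out
    return np.exp(eta)

# untrained toy parameters: 4 weather covariates, hidden size 6,
# 3 static claim features
d, hidden = 4, 6
params = (0.3 * rng.normal(size=(hidden, d)),
          0.3 * rng.normal(size=(hidden, hidden)),
          np.zeros(hidden),
          rng.normal(size=hidden),
          rng.normal(size=3),
          0.0)

weather = rng.normal(size=(24, d))  # 24 hourly observations of the event
static = rng.normal(size=3)         # e.g. property characteristics
loss = predict_severity(weather, static, params)
```

The design point is that the recurrence produces a fixed-length summary of an arbitrarily long weather sequence, so claims with loss events of different durations can share one triage model alongside the usual static features.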