Improving radiation therapies for cancer mathematically
In a paper published in December in the SIAM Journal on Scientific Computing, authors Li-Tien Cheng, Bin Dong, Chunhua Men, Xun Jia, and Steve Jiang propose a method to optimize radiation therapy treatments in cancer patients.
Radiation therapy is one of the primary methods used for cancer treatment, along with chemotherapy and surgery. While doses of radiation are delivered to eliminate cancerous tissue, care is taken to keep radiation within acceptable levels so as not to affect neighboring tissues and organs. The most common type of therapy delivers high-energy radiation via a medical linear accelerator mounted on a rotating apparatus to adjust the direction, and a collimator to shape the beam of radiation. In the recently developed volumetric modulated arc therapy (VMAT), beams continuously deliver doses as the delivery device rotates around the patient. Improving radiotherapy treatment is challenging because of the complexities of shape optimization, which arise both from the mechanics of the equipment involved and from the apertures of the devices delivering the beams of radiation.
In this paper, the authors develop a variational model and associated numerical techniques for optimization of VMAT treatment plans. The method uses CT scans of patients—with important tissues and organs identified by image segmentation algorithms—to create an improved and customized treatment plan by constructing parameters for an optimal dose distribution in VMAT treatment. Mathematical techniques such as a binary level-set method for shape optimization are used. Tests have shown improved dose distributions in both model and clinical cases.
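To give a flavor of the binary level-set idea, the sketch below represents a collimator aperture on a grid by a function taking only the values +1 (open) and -1 (closed); the shape being optimized is the set where the function is +1. This is a toy illustration of the representation only, with a hypothetical circular aperture and uniform fluence, not the authors' optimization algorithm.

```python
import numpy as np

# Binary level-set representation of an aperture shape:
# phi = +1 inside the aperture, -1 outside; the shape is {phi = +1}.
n = 64
y, x = np.mgrid[0:n, 0:n]
phi = np.where((x - n / 2) ** 2 + (y - n / 2) ** 2 <= (n / 4) ** 2, 1.0, -1.0)

# Recover the 0/1 aperture mask from the level-set function.
mask = (phi + 1.0) / 2.0

# A hypothetical uniform fluence map; the dose delivered through the
# aperture is the fluence restricted to the open part of the collimator.
fluence = np.ones((n, n))
dose = (fluence * mask).sum()
```

Encoding the shape as a binary function turns shape optimization into optimization over a discrete-valued field, which is what makes level-set machinery applicable to the aperture-shaping problem.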
View the paper:
SIAM Journal on Scientific Computing, 35(6), B1321–B1340
Evaluating climate models
The simulation of elements affecting the Earth’s climate is usually carried out by coupled atmosphere-ocean circulation models, output from which is used to provide insights into future climate states. Climate models have traditionally been assessed by comparing summary statistics (such as global annual mean temperatures) or point estimates from simulated models to the corresponding observed quantities.
A paper published last December in the SIAM/ASA Journal on Uncertainty Quantification argues that it is more appropriate to compare the distribution of climate model output data (over time and space) to the corresponding distribution of observed data. Distance measures between probability distributions, also called divergence functions, can be used to make this comparison.
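As a toy illustration of a divergence function, the snippet below computes the Kullback-Leibler divergence between two discretized distributions, such as histograms of simulated versus observed temperatures. The paper considers a framework of divergence measures; KL divergence is just one common choice, and the histograms here are invented for illustration.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete probability vectors p and q.

    A small eps avoids log(0) for empty histogram bins; both inputs
    are normalized so raw counts can be passed directly.
    """
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

model_hist = [5, 20, 40, 25, 10]   # hypothetical model-output histogram
obs_hist   = [8, 18, 38, 26, 10]   # hypothetical observation histogram

d = kl_divergence(model_hist, obs_hist)  # small value: distributions close
```

A divergence of zero means the two distributions agree exactly, so smaller values indicate a climate model whose output distribution better matches the observed one.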
The authors evaluate 15 different climate models by comparing simulations of past climate to corresponding reanalysis data. Reanalysis datasets are created by assimilating historical climate observations into a single fixed model throughout the entire reanalysis period, in order to reduce the effects of modeling changes on climate statistics. Historical weather observations are thus used to reconstruct atmospheric states on a global grid, allowing direct comparison to climate model output.
View the paper:
SIAM/ASA Journal on Uncertainty Quantification, 1, 522–534
Math models analyze long-term criminal activity patterns in a population
In a paper published last November in Multiscale Modeling and Simulation: A SIAM Interdisciplinary Journal, authors Henri Berestycki, Nancy Rodríguez, and Lenya Ryzhik show that the assumption of a population’s natural tendency towards crime significantly changes long-term criminal activity patterns. The authors use a reaction-diffusion system to study criminal activity. A reaction-diffusion model describes how the components of a system interact (in this case, population zones and criminals) and predicts how one or more of the components (in this case, criminals) move in space and time. Reaction-diffusion models can have traveling wave solutions, in which the waveform is a function of a single variable and travels at a constant speed. The authors show that traveling wave solutions of such a model connect zones with no criminal activity to zones of high criminal activity (or hotspots), which corresponds to an invasion of criminal activity into all space. The paper studies the problem of preventing such invasions by employing a finite number of resources that reduce the payoff for committing a crime, and characterizes the minimum amount of resources necessary to prevent the invasion of criminal activity.
While it is natural to expect that a criminal hot or warm spot might form in a population with a natural tendency toward crime (based on motivation, opportunity, and the running average of crime in an area), the authors make an interesting observation about populations that are indifferent, with zero tendency toward criminal activity. In that case, a lack of crime is a steady state; once the payoff for committing a crime is high enough, this state becomes unstable and criminal activity begins to dominate. In populations with a natural tendency to be peaceful or crime-free, there is an interesting interplay between that natural tendency and the payoff for committing a crime. Here there are three steady states: a complete lack of crime, a small amount of crime, and a hotspot. Since the small-crime state is usually unstable, the population tends toward either zero crime or a hotspot in the long run.
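The invasion-versus-coexistence picture above can be illustrated with a much simpler toy model: a one-dimensional bistable reaction-diffusion equation u_t = u_xx + u(u - a)(1 - u), in which u = 0 (no crime) and u = 1 (hotspot) are stable steady states and u = a is the unstable intermediate one. For a < 1/2, a traveling front of the high-activity state invades the quiescent region. This scalar caricature is my own illustration; the paper analyzes a coupled two-component system.

```python
import numpy as np

# Toy bistable reaction-diffusion equation u_t = u_xx + u(u - a)(1 - u),
# integrated with explicit Euler finite differences.  With a < 1/2 the
# u = 1 "hotspot" state invades the u = 0 "no crime" state as a
# traveling front.
a = 0.25
n, dx, dt, steps = 200, 1.0, 0.2, 4000   # dt < dx**2 / 2, so the scheme is stable
u = np.zeros(n)
u[:20] = 1.0                              # seed a hotspot on the left edge

for _ in range(steps):
    lap = np.zeros(n)
    lap[1:-1] = (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2
    lap[0], lap[-1] = lap[1], lap[-2]     # crude no-flux boundaries
    u = u + dt * (lap + u * (u - a) * (1 - u))

# After enough time the front has swept across the domain, so points far
# from the initial seed sit near the hotspot state u = 1.
```

Raising a toward 1/2 slows and eventually reverses the front, which is the scalar analogue of deploying enough crime-prevention resources to block the invasion.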
View the paper:
Multiscale Modeling & Simulation, 11(4), 1097–1126
Mathematically correcting over- and underexposure in photographs
Almost anyone with a camera or smartphone is sure to have noticed that taking pictures in bright conditions, such as a sunny day, can cause a loss of highlight details (or overexposure) in bright regions and a loss of shadow details (or underexposure) in dark regions. A paper published last November in the SIAM Journal on Imaging Sciences attempts to overcome these photography woes.
In digital photography, “exposure” controls how much light reaches the image sensor, and the lightness of the photograph is determined by the amount of light captured. There is a physical limit on the lightness contrast, or separation between the darkest and brightest areas of an image, that a camera can capture; this limit is called the dynamic range. An outdoor scene with bright or harsh lighting conditions has a much higher dynamic range than a regular image sensor can accommodate, which results in the loss of highlight and shadow details.
In their paper aptly titled “Recovering Over-/Underexposed Regions in Photographs,” authors Likun Hou, Hui Ji, and Zuowei Shen present a new wavelet frame-based approach for correcting pixels that are affected by over- and underexposure in photographs. Generic color image restoration techniques, such as image inpainting and contrast enhancement, are not optimized to correct over- and underexposure in digital photography. Hence, the authors design color image restoration methods specifically for exposure correction, which simultaneously recover over- and underexposed regions.
The problem of over-/underexposure is broken down and resolved as three sub-problems. The first is an inpainting problem that recovers the lightness values of over-/underexposed pixels, which have been clipped to fit the dynamic range. The second adjusts the lightness of the whole image to fit the displayable range while revealing more image detail in the darker regions. The third recovers the missing chromatic details of overexposed pixels from the information provided by neighboring well-exposed pixels.
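The starting point for all three sub-problems is knowing which pixels were clipped. The sketch below flags under- and overexposed pixels in a lightness channel with simple thresholds near the ends of the representable range; the thresholds and the small lightness array are invented for illustration, and the paper's actual detection and wavelet-frame inpainting are far more sophisticated.

```python
import numpy as np

def clipped_masks(lightness, lo=0.02, hi=0.98):
    """Return boolean masks of under- and overexposed pixels.

    Pixels at or below `lo` are treated as crushed shadows and pixels at
    or above `hi` as blown highlights; these are the pixels whose true
    values must later be estimated from well-exposed neighbors.
    """
    under = lightness <= lo
    over = lightness >= hi
    return under, over

# Hypothetical lightness channel in [0, 1] with blown highlights (1.0)
# and crushed shadows (values near 0.0).
L = np.array([[0.00, 0.30, 0.50],
              [0.70, 1.00, 1.00],
              [0.00, 0.01, 0.60]])
under, over = clipped_masks(L)
```

The union of the two masks marks the unknown region for the lightness inpainting step, while the complement supplies the trusted neighboring values used to fill it in.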
View the paper:
SIAM Journal on Imaging Sciences, 6(4), 2213–2235
*All Nugget Byte source papers will be free on the links above until June 5, 2014.