For over a decade, address-based sampling (ABS) frames have often been used to draw samples for multistage area sample surveys in lieu of traditional listed (or enumerated) address frames. However, it is well known that the use of ABS frames for face-to-face surveys suffers from undercoverage due to, for example, households that receive mail via a PO Box rather than at the household's street address. Undercoverage of ABS frames has typically been more prominent in rural areas but can also occur in urban areas with recent housing construction. Procedures have been developed to supplement ABS frames to address this undercoverage. In this article, we investigate a procedure called Address Coverage Enhancement (ACE) that supplements the ABS frame with addresses not found on the frame, and the effects that the addresses added to the sample through ACE have on estimates. Weighted estimates from two studies, the Population Assessment of Tobacco and Health Study and the 2017 US Program for the International Assessment of Adult Competencies, are calculated with and without the supplemental addresses. We then assess whether poststratifying analysis weights to control for urbanicity at the person level brings estimates from the unsupplemented frame closer to those from the supplemented frame. Our findings show that the noncoverage bias was likely minimal across both studies for a range of estimates. The main reason is that the Computerized Delivery Sequence file coverage rate is high, and when the coverage rate is high, only very large differences between covered and noncovered households will result in meaningful bias.
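The coverage-bias argument in the final sentence can be made explicit with the standard noncoverage bias decomposition for a mean (a textbook result, not stated in the abstract itself): if only the covered portion of the frame contributes to estimation,

\[
\operatorname{Bias}(\bar{y}_C) \;=\; \bar{Y}_C - \bar{Y} \;=\; (1 - W_C)\,(\bar{Y}_C - \bar{Y}_U),
\]

where \(W_C\) is the proportion of the population covered by the frame, \(\bar{Y}_C\) is the population mean among covered units, and \(\bar{Y}_U\) is the mean among noncovered units. For illustration, with a coverage rate of, say, 0.97, even a 10 percentage point difference between covered and noncovered households shifts an estimated proportion by only 0.3 percentage points, which is why high Computerized Delivery Sequence coverage keeps the noncoverage bias small.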
Responsive survey design (RSD) aims to increase the efficiency of survey data collection via live monitoring of paradata and the introduction of protocol changes when survey errors and increased costs seem imminent. Daily predictions of response propensity for all active sampled cases are among the most important quantities for live monitoring of data collection outcomes, making sound predictions of these propensities essential for the success of RSD. Because it relies on real-time updates of prior beliefs about key design quantities, such as predicted response propensities, RSD stands to benefit from Bayesian approaches. However, empirical evidence of the merits of these approaches is lacking in the literature, and informative prior distributions must be derived for these approaches to be effective. In this paper, we evaluate two approaches to deriving prior distributions for the coefficients of daily response propensity models, assessing whether they improve predictions of daily response propensity in a real data collection employing RSD. The first approach involves analyses of historical data from the same survey, and the second involves a review of the relevant literature. We find that Bayesian methods based on these two approaches yield higher-quality predictions of response propensity than more standard approaches that ignore prior information. This is especially true during the early-to-middle periods of data collection, when survey managers using RSD often consider interventions.
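The abstract does not specify the form of the daily response propensity model. As an illustration only, the Python sketch below treats it as a Bayesian logistic regression with independent Normal priors on the coefficients, fit by maximum a posteriori estimation; the covariates, prior means, and prior variances are hypothetical placeholders standing in for values that would, per the paper, come from historical data on the same survey or from a literature review.

# Illustrative sketch only: daily response propensity model with informative
# Normal priors on the coefficients, fit by MAP estimation. All covariates
# and numeric prior values are hypothetical, not taken from the paper.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

rng = np.random.default_rng(0)

# Simulated paradata for active cases: intercept, days in field, prior contact attempts.
n = 500
X = np.column_stack([np.ones(n),
                     rng.integers(1, 30, n),   # days in field (hypothetical covariate)
                     rng.integers(0, 5, n)])   # prior contact attempts (hypothetical covariate)
true_beta = np.array([-2.0, 0.03, 0.25])
y = rng.binomial(1, expit(X @ true_beta))      # 1 = case responded that day

# Informative priors, e.g., estimated from a previous wave of the same survey
# or elicited from published response propensity models.
prior_mean = np.array([-1.8, 0.02, 0.20])
prior_var = np.array([0.50, 0.01, 0.05])

def neg_log_posterior(beta):
    """Negative Bernoulli log-likelihood plus independent Normal log-priors."""
    eta = X @ beta
    log_lik = np.sum(y * eta - np.log1p(np.exp(eta)))
    log_prior = -0.5 * np.sum((beta - prior_mean) ** 2 / prior_var)
    return -(log_lik + log_prior)

fit = minimize(neg_log_posterior, x0=prior_mean, method="BFGS")
daily_propensity = expit(X @ fit.x)            # predicted daily response propensities
print("MAP coefficients:", np.round(fit.x, 3))

In a live RSD setting, predictions of this kind would be refreshed each day as new paradata arrive and fed into the monitoring dashboards that survey managers consult when deciding whether to intervene.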