PRAMS Phase 8 Web Mode Data Collection Metrics

Att 2f - PRAMS Phase 8 Web Mode Data Collection Metrics_03132023.docx

[NCCDPHP] Pregnancy Risk Assessment Monitoring System (PRAMS)


OMB: 0920-1273


Attachment 2f – PRAMS Phase 8 Web Mode Data Collection Metrics


Pregnancy Risk Assessment Monitoring System (PRAMS) Web Mode Data Collection Metrics



In the spring of 2022, the Office of Management and Budget (OMB) approved a PRAMS request to implement a web survey mode in five sites (Maryland, Puerto Rico, South Carolina, Virginia, and Wyoming). All potential participants within those sites were offered the opportunity to complete the survey by web, mail, or telephone interview. A pre-survey letter and up to three mailed surveys are sent to all potential participants. The pre-survey letter introduces PRAMS and informs participants of the option to complete the survey by paper, web, or telephone. The letters sent prior to and with the mailed (paper) surveys include a URL and a QR code with a unique User ID and password, allowing the sampled mother to securely access the web survey. If no web or mail response is received, up to 15 call attempts are made to each viable phone number. Sampled mothers reached by phone may complete the survey as a telephone interview or by mail or web.



All five sites initiated the web mode with their May batch. With a 90-day data collection cycle, data were collected from May through July of 2022. To understand how the addition of the web mode influenced response rates, data quality, and cost, we assessed changes in response rates overall and by key subpopulations, item nonresponse and partial survey completion rates, staff time and postage costs, and the frequency of technical support inquiries received from respondents as an indicator of any challenges using the web mode. Our analyses compare these key metrics for the first web batch against the batch immediately prior (the April batch, with data collection from April through June and no web mode offered). Findings from the initial implementation of the web mode help inform whether and how to scale up the number of PRAMS jurisdictions offering a web mode of data collection.



Response Rates: We examined unweighted response rates for the pre-web batch compared to the first web batch, overall and by mode of participation. Table 1 presents results for the five sites. Four of the five sites experienced increases in response after adding the web mode, ranging from 1 to 7 percentage points, and one site experienced a decline of 7 percentage points. Changes in response were tested using chi-squared tests; none of the changes was statistically significant. Across all five sites, the average difference was an increase of 2.2 percentage points. Breaking out response by mode, phone response rates were similar before and after the web mode was added. The web mode appears to have mostly drawn respondents who would otherwise have completed the survey by mail.
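As an illustration of the chi-squared comparison described above, the sketch below tests the Maryland before/after response counts from Table 1. It is a minimal hand-computed Pearson chi-squared test for a 2x2 table (no continuity correction, no external libraries); 3.841 is the 0.05 critical value at 1 degree of freedom.

```python
# Pearson chi-squared test (2x2, no continuity correction) for a change
# in response rate before vs. after adding the web mode.

def chi2_2x2(a, b, c, d):
    """Cells: [[a, b], [c, d]] = [[resp_before, nonresp_before],
                                  [resp_after,  nonresp_after]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Maryland (Table 1): 50/160 responded pre-web, 59/159 responded post-web.
stat = chi2_2x2(50, 110, 59, 100)
# The statistic falls well below the 3.841 critical value (p = 0.05, 1 df),
# consistent with the report's finding of no significant change.
print(round(stat, 2), stat < 3.841)
```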

The PRAMS survey used was identical for the before and after batches in all sites except South Carolina. South Carolina initiated a social determinants of health supplement that coincided with the introduction of the web mode, so the survey included an extra page of questions for the post-web batch. That may have contributed to the response rate decline seen in South Carolina after the web mode was introduced.



Table 1. Response rates overall and by data collection mode

Maryland
                      Before Web Mode Added    After Web Mode Added
                      n        %               n        %
Mail                  34       21.3            17       10.7
Phone                 16       10.0            14       8.8
Web                   --       --              28       17.6
All Respondents       50       31.3            59       37.1
Non-Respondents       110      68.7            100      62.9
Total                 160      100.0           159      100.0

Puerto Rico
                      Before Web Mode Added    After Web Mode Added
                      n        %               n        %
Mail                  26       21.7            3        3.0
Phone                 71       59.2            54       54.6
Web                   --       --              24       24.2
All Respondents       97       80.8            81       81.8
Non-Respondents       23       19.2            18       18.2
Total                 120      100.0           99       100.0

South Carolina
                      Before Web Mode Added    After Web Mode Added
                      n        %               n        %
Mail                  26       25.2            7        6.9
Phone                 10       9.7             8        7.9
Web                   --       --              13       12.9
All Respondents       36       34.9            28       27.7
Non-Respondents       67       65.1            73       72.3
Total                 103      100.0           101      100.0

Virginia
                      Before Web Mode Added    After Web Mode Added
                      n        %               n        %
Mail                  42       27.3            29       20.6
Phone                 16       10.4            17       12.1
Web                   --       --              17       12.1
All Respondents       58       37.7            63       44.7
Non-Respondents       96       62.3            78       55.3
Total                 154      100.0           141      100.0

Wyoming
                      Before Web Mode Added    After Web Mode Added
                      n        %               n        %
Mail                  23       37.1            17       21.8
Phone                 7        11.3            12       15.4
Web                   --       --              12       15.4
All Respondents       30       48.4            41       52.6
Non-Respondents       32       51.6            37       47.4
Total                 62       100.0           78       100.0



Subpopulation Response: In addition to overall response rates, we examined response rates before and after implementation of the web mode by age, race, ethnicity, and education. The numbers were too small to examine each site separately, so results were aggregated over the five sites. Table 2 displays unweighted subpopulation-specific response rates for the pre-web batch and the first web batch, aggregated over all sites. For all categories of age, race, ethnicity, and education, response rates increased after the addition of the web mode, with the exception of the under-20 age group, which has the smallest sample size, making it more likely that the decrease reflects random fluctuation in survey response. For subpopulations that typically have lower survey response rates, such as Black women and those with less than a high school education, the addition of the web mode increased response rates by 2.2 and 9.3 percentage points, respectively. Chi-squared tests indicated that none of the changes in response rate by subpopulation was statistically significant.
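The percentage-point changes cited above can be reproduced directly from the counts in Table 2; a minimal Python check:

```python
# Response-rate change in percentage points, computed from Table 2 counts
# (responded / sampled, before vs. after the web mode was added).

def rate_change_pp(resp_before, n_before, resp_after, n_after):
    before = 100 * resp_before / n_before
    after = 100 * resp_after / n_after
    return round(after - before, 1)

# Black women (Table 2): 41/139 pre-web vs. 46/145 post-web.
black = rate_change_pp(41, 139, 46, 145)
# Less than high school education: 26/81 pre-web vs. 24/58 post-web.
low_edu = rate_change_pp(26, 81, 24, 58)
print(black, low_edu)  # +2.2 and +9.3 percentage points
```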

Table 2. Response rates by maternal characteristics, aggregated over the five sites

Five Sites Aggregate    Before Web Mode Added        After Web Mode Added
                        Sampled  Responded  %        Sampled  Responded  %
Age
  <20                   35       16         45.7     34       14         41.2
  20-29                 281      121        43.1     255      122        47.8
  >=30                  283      136        48.1     289      142        49.1
Race
  White                 377      198        52.5     349      196        56.2
  Black                 139      41         29.5     145      46         31.7
  Other                 83       34         41.0     84       36         42.9
Ethnicity
  Hispanic              185      129        69.7     169      121        71.6
  Not Hispanic          413      144        34.9     405      155        38.3
Education
  <12 years             81       26         32.1     58       24         41.4
  >=12 years            514      245        47.7     514      254        49.4





Item Nonresponse: Six core indicators on the PRAMS survey were selected to assess the degree of item nonresponse as a measure of data quality. The indicators were chosen based on their location on the survey, with two at the beginning, two in the middle, and two at the end of the survey. None of the selected indicators is part of a skip pattern, so all respondents should have provided a valid response. Table 3 displays the overall and mode-specific item nonresponse rate for each indicator, aggregated over all five sites. Item nonresponse rates for the web mode were consistently lower than the corresponding rates for the mail and phone modes. The income question elicited the highest item nonresponse rate of all survey items, at over 10%. For the income question, mail and phone respondents had item nonresponse rates 4.5 and 20.5 times higher, respectively, than web respondents. Because the income and household size questions are the last two core questions on the survey, the low missing rates for web indicate that web respondents are completing the entire survey rather than dropping off early.
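The mode-to-web ratios quoted for the income question follow directly from the rates in Table 3; a quick Python check (rates in percent):

```python
# Item nonresponse rates for the income question (Table 3), in percent.
income_nonresponse = {"mail": 1.8, "phone": 8.2, "web": 0.4}

# How many times higher are the mail and phone rates than the web rate?
mail_ratio = income_nonresponse["mail"] / income_nonresponse["web"]
phone_ratio = income_nonresponse["phone"] / income_nonresponse["web"]
print(round(mail_ratio, 1), round(phone_ratio, 1))  # 4.5 and 20.5
```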

Table 3. Item Nonresponse Rate Overall and by Mode

                                              Mode of Participation
Indicator                         Survey      Mail      Phone     Web      Overall
                                  Location    (n=226)   (n=226)   (n=96)
                                              %         %         %        %
Pregnancy intention               Beginning   0.7       0.7       0.0      1.4
Multivitamin use                  Beginning   0.0       0.2       0.0      0.2
Physical abuse before pregnancy   Middle      0.4       0.0       0.0      0.4
Maternal postpartum check-up      Middle      0.6       0.6       0.0      1.1
Income                            End         1.8       8.2       0.4      10.4
Household size                    End         0.6       1.5       0.4      2.4





Postage Cost: Early adopter sites tracked their mailings and associated postage costs by batch. The entire sample receives the introductory letter, followed as needed by up to three survey packets and a tickler/thank-you (first survey packet, tickler/thank-you, second survey packet, third survey packet). Mailings that include survey materials are sent in larger envelopes than the introductory and thank-you letters. Table 4 presents postage costs from the five sites. All sites experienced a reduction in postage costs after implementing the web mode. This result was expected: the web mode allows earlier completion of the survey, so fewer follow-up mailings are needed. The reduction ranged from 4% to 13% of the pre-web costs. Note that monthly batch sizes vary, and costs vary with the size of the batch, so we adjusted the costs to account for differential batch sizes. The adjusted post-web postage cost was obtained by dividing the total pre-web batch sample size (number of sampled mothers) by the total post-web batch size and multiplying by the unadjusted post-web postage cost.
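The batch-size adjustment described above reduces to a one-line calculation. The sketch below uses hypothetical batch sizes and a hypothetical cost for illustration (the report does not publish the raw batch sizes used for each adjustment):

```python
# Adjust a post-web total (postage cost or staff hours) for a different
# batch size: adjusted = (pre-web batch size / post-web batch size)
#                        * unadjusted post-web value.

def adjust_for_batch_size(pre_batch_n, post_batch_n, unadjusted_value):
    return pre_batch_n / post_batch_n * unadjusted_value

# Hypothetical example: a pre-web batch of 120 sampled mothers, a post-web
# batch of 100, and an unadjusted post-web postage cost of $900.
adjusted_cost = adjust_for_batch_size(120, 100, 900.00)
print(adjusted_cost)  # scales the post-web cost up to the pre-web batch size
```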



Table 4. Postage Costs by Site

Maryland
                            Before Web Mode Added       After Web Mode Added
                            Surveys   Postage Cost      Surveys   Postage Cost
Mail survey packets         770       $959              702       $856
Return mail surveys         35        $62               17        $30
Total postage cost                    $1,021                      $886
Percentage change in cost                               -13.2%

Puerto Rico
                            Before Web Mode Added       After Web Mode Added
                            Surveys   Postage Cost      Surveys   Postage Cost
Mail survey packets         574       $1,159.00         524*      $1,044.00*
Return mail surveys         n/a       n/a               n/a       n/a
Total postage cost                    $1,159.00                   $1,044.00*
Percentage change in cost                               -9.9%

South Carolina
                            Before Web Mode Added       After Web Mode Added
                            Surveys   Postage Cost      Surveys   Postage Cost
Mail survey packets         535       $705.00           532*      $697.00*
Return mail surveys         26        $41.00            9         $14.00
Total postage cost                    $746.00                     $711.00*
Percentage change in cost                               -4.7%

Virginia
                            Before Web Mode Added       After Web Mode Added
                            Surveys   Postage Cost      Surveys   Postage Cost
Mail survey packets         713       $1,035.00         697*      $993.00*
Return mail surveys         n/a       n/a               n/a       n/a
Total postage cost                    $1,035.00                   $993.00*
Percentage change in cost                               -4.1%

Wyoming
                            Before Web Mode Added       After Web Mode Added
                            Surveys   Postage Cost      Surveys   Postage Cost
Mail survey packets         411       $431.00           392*      $424.00*
Return mail surveys         23        $40.50            14*       $23.80*
Total postage cost                    $471.50                     $447.80*
Percentage change in cost                               -5.0%

*Numbers adjusted for differences in batch size



Staffing Hours: Early adopter sites tracked the staff time required to conduct data collection activities. Activities tracked included preparation of mailings, mail data entry, and telephone interviewing. The hours reported for telephone interviewing include all time spent making calls, whether or not the calls resulted in a completed interview. Table 5 presents staff hours expended by the five sites. Four of the five sites reported a reduction in staff hours after implementation of the web mode, ranging from 6% to 41% of the pre-web staff hours. The one site (Wyoming) that experienced an increase in staff hours explained that, because web mode respondents have a choice of reward, staff had to contact each web respondent individually to confirm the reward selection. Staff did not have to contact mail or phone respondents: the mailed surveys included a card for respondents to select their reward and return with the survey, and phone respondents are asked their preferred reward at the end of the interview. The other four sites either provide the same reward to all participants or do not provide a reward for web respondents, so no follow-up was needed. Note that monthly batch sizes vary, and staff hours vary with the size of the batch, so hours were adjusted to account for differential batch sizes. The adjusted post-web staff hours were obtained by dividing the total pre-web batch sample size (number of sampled mothers) by the total post-web batch size and multiplying by the unadjusted staff hours.


Table 5: Staff hours expended by site

Maryland
                                  Before Web Mode Added    After Web Mode Added
                                  Surveys   Staff Hours    Surveys   Staff Hours
Mail assembly                     770       5.4            702       4.2
Mail data entry                   35        2.7            17        1.3
Telephone interviews              --        114            --        66
Total Staff Hours                           122.1                    71.5
Percentage change in staff hours                           -41.4%

Puerto Rico
                                  Before Web Mode Added    After Web Mode Added
                                  Surveys   Staff Hours    Surveys   Staff Hours
Mail assembly                     574       103            520*      97.7*
Mail data entry                   26        13             3         1.5
Telephone interviews              --        106            --        33*
Total Staff Hours                           222                      132.2*
Percentage change in staff hours                           -40.5%

South Carolina
                                  Before Web Mode Added    After Web Mode Added
                                  Surveys   Staff Hours    Surveys   Staff Hours
Mail assembly                     535       32             532*      26*
Mail data entry                   26        8.7            9         3
Telephone interviews              --        36             --        43*
Total Staff Hours                           76.7                     72*
Percentage change in staff hours                           -6.1%

Virginia
                                  Before Web Mode Added    After Web Mode Added
                                  Surveys   Staff Hours    Surveys   Staff Hours
Mail assembly                     713       34.2           697*      31.9*
Mail data entry                   42        3.5            29        2.4
Telephone interviews              --        40.3           --        30.6*
Total Staff Hours                           78                       64.9*
Percentage change in staff hours                           -16.8%

Wyoming
                                  Before Web Mode Added    After Web Mode Added
                                  Surveys   Staff Hours    Surveys   Staff Hours
Mail assembly                     288       7.2            276*      8.5*
Mail data entry                   23        3.1            3         2.3
Telephone interviews              --        30.3           --        36.6*
Total Staff Hours                           40.6                     47.4*
Percentage change in staff hours                           16.7%

*Numbers adjusted for differences in batch size





Technical Support Queries: Early adopter sites tracked the number of phone and email queries they received from sampled mothers about the PRAMS survey. Across the five sites, there were eight queries in the pre-web batch and eight in the post-web batch. We did not collect information on the nature of the queries received. The consistent number of queries suggests that the instructions and interface for the web mode are not causing confusion, and that sampled mothers can access and navigate the web system without additional technical support.



Summary

In summary, we examined metrics related to response rates, item nonresponse rates, costs, staff burden, and user queries to assess the impact of adding a web survey mode to PRAMS. Across the metrics examined, the post-web results showed modest overall improvement relative to their pre-web counterparts. After the web mode was implemented, response rates increased by two percentage points on average; response rates improved for all categories of race, ethnicity, and education level examined; item nonresponse was lower for web than for mail and telephone modes; postage costs fell by 4% to 13%; staff hours fell by 6% to 41% in all but one site; and no additional technical support queries were received from sampled mothers.




File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Bauman, Brenda (CDC/DDNID/NCCDPHP/DRH)
File Created: 2024:11:24 07:54:36Z
