Chapter 2: Study Objectives and Methodology

Abstract

Measuring the costs and benefits of geological mapping by State Geological Surveys (SGS) and the U.S. Geological Survey (USGS) involved the distribution of two questionnaires. The first questionnaire compiled data on SGS and USGS costs for geological mapping, while the second gathered comprehensive stakeholder assessments of the usefulness and value of geological maps (i.e., benefits data). For SGS, federal funding sources were the STATEMAP program of the National Cooperative Geologic Mapping Program, other USGS mission areas, and other federal agencies. State funding sources included the 1:1 match requirement for funds received under the STATEMAP program and funding from other state agencies, as well as from county, municipal, private industry, and nongovernmental organization (NGO) sources. USGS federal funding sources were those received directly from congressional appropriations, as well as from other USGS mission areas involved with geological mapping and other federal agencies. To acquire data on valuation, an online questionnaire was sent to >81,000 stakeholders and nearly 4,800 responses were received (~6% response rate). Stakeholder categories included individuals representing economic development, NGOs, state and local government agencies, associations and societies, consulting companies, large industries, rock and mineral clubs, and others. Responses to specific map-value questions were easily tabulated. However, to manage the overwhelming responses (~700 pages) to several long text-based narrative questions, manually labeled training data were analyzed for word-use frequency to generate predictive keywords for automated categorization.

2.1: Introduction

The primary purpose of publicly funded institutions such as SGS and the USGS is to generate scientific knowledge of geology and make it available for natural resources, geological hazard, economic, and environmental applications. Geological maps present this knowledge in a concise form and are supported by reports and data sets to enhance and interpret the maps. The process is a two-way street, where feedback from users of geological maps and reports helps identify what kind of geological knowledge is needed in practice and which geographical areas need prioritization for geological mapping. Businesses and public policy makers require geological information to guide investment decisions as well as balance economic development with evaluations of natural resources (water, mineral, and energy) and geological hazards, and in so doing address environmental and public safety issues. The continuous interaction between users of geological knowledge and its generators is key to maintaining the quality, efficiency, and usefulness of the process.

Unlike some physical commodities used as ingredients in the production of other goods, scientific knowledge such as geological maps, data, and reports is not “consumed” by its users, but rather remains available for decades of use. The maps and reports commonly need to be enhanced, adapted, and/or modified to suit the application. For example, for most users it is not sufficient to create only site-specific geological knowledge, yet it is generally beyond the user’s ability and means to generate geological knowledge outside of a specific project site. In cases where some users may have the means to generate geological knowledge beyond the project site, they are unlikely to make it freely available to others. If each user must repeatedly create the same geological knowledge as needed, the result is economic inefficiency. It is therefore essential that publicly funded agencies take responsibility for creating geological knowledge and making it available as a “public good”. Accordingly, the basic methodology for this assessment of the value of geological maps is based on the premise that geological knowledge (maps, data, reports) is a “public good”.

The economic justification for handling a “public good” differently from a “private good” has been discussed in previous studies on costs and benefits of geological maps. In the U.S., such studies have been conducted in several states such as Illinois (Bhagwat and Berg, 1991), Kentucky (Bhagwat and Ipe, 2000), Nevada (Bhagwat, 2014), Ohio (Kleinhenz & Associates, 2011), and Indiana (Capstone Class, 2017). Briefly, unlike a private good, such as a mobile phone or a car, a public good can be procured by many at the same time without being “consumed”. It remains available for others in the present and in the future. Therefore, the benefits of public goods to society are additive over many users.

The benefits of geological knowledge to society are measured indirectly because, as a public good, this knowledge is provided free or at minimal cost, mostly equivalent to the cost of printing and/or helping to maintain a website where geological maps are served. The consumer does not pay a market-determined price. However, having geological knowledge can avoid some costs to the consumer in terms of time saved to gather the knowledge and by avoiding other costs that may be incurred from the lack of adequate knowledge of geology. Cost savings and cost avoidance are concepts used in business management, which differ from one another in that cost savings refers to known expenses that could be saved by taking certain actions, whereas cost avoidance refers to anticipated future costs that could be avoided by taking certain actions now. Unlike current costs, future costs are unknown. Therefore, cost avoidance necessarily involves estimation of future costs that seem rational. Management steps taken in the present can be justified by the expectation that they will lead to savings in the future. In short, avoided costs are equivalent to benefits (e.g., Lizzuo et al., 2019; Chiavacci et al., 2020). Specific literature concerning public goods, such as geological maps, has been cited (e.g., Bhagwat and Ipe, 2000; Garcia-Cortes et al., 2005; Kleinhenz & Associates, 2011; Bhagwat, 2014). In the case of Superfund sites, the criteria used by the U.S. Environmental Protection Agency (USEPA) to determine how much contribution to expect from entities responsible for the pollution of sites targeted for clean-up are known and listed. The underlying rationale for the Superfund program is that expected societal costs caused by the environmental pollution are greater than the clean-up costs, even though the societal costs are not known.
The value of geological knowledge to the user may depend instead on the amount of time and money that the user may otherwise have to spend to create the knowledge themselves. Using geological maps may affect the economic outcome of projects, but the extent of this effect and whether it influences how much the user is willing to pay were not investigated.

The usefulness of the above approach has been tested and confirmed by others who conducted such studies in the U.S. and overseas (e.g., Bhagwat and Ipe, 2000; Garcia-Cortes et al., 2005; Kleinhenz & Associates, 2011; Bhagwat, 2014), as well as by reviewers of economic literature at academic institutions (e.g., Häggquist and Söderholm, 2015). A brief summary is provided in Chapter 1.

This report is the first of its kind at the national level in the U.S. It consists of two major parts. First, Chapters 3, 4, and 5 take stock of public perceptions of geological maps produced by SGS and the USGS, funds spent on geological mapping, and the extent of mapping accomplished. Second, Chapters 6, 7, 8, 10, 11, and 12 solicit user input on map preferences, the usefulness of maps and their perceived value, as well as user input to guide future mapping. To accomplish this, two different questionnaires were drafted.

The first questionnaire, designed to compile data on the costs or spending for geological mapping, mapping accomplishments, and future mapping needs, was sent to SGS and the USGS. It essentially consisted of a spreadsheet within which funding allocations from state, federal, and other sources were tabulated for individual SGS and the USGS for the 1994 to 2019 time period. In addition, it requested information on the proportions of completed mapping for bedrock and Quaternary geology at specific scales, as well as progress to date on a variety of derivative maps.

The second questionnaire, designed to seek comprehensive assessments of the usefulness and value of geological maps, was distributed by SGS to traditional map users and stakeholders. It consisted of 25 questions requesting information on the respondent’s (1) type of organization (e.g., various types of private vs. public institutions); (2) activities related to geological maps; (3) estimates of time and costs saved by having access to publicly available geological maps; (4) type of preferred map product (e.g., digital vs. paper copy); (5) descriptive narrative of the benefits of publicly available maps; (6) approximations of additional incurred costs on individual projects if maps were not publicly available; (7) willingness to pay for geological maps if not publicly available; (8) perception of the long-term value of geological maps; (9) preferred scale of maps; (10) inferred importance of digital online access to geological maps; (11) descriptions of how maps are obtained for projects if not publicly available; (12) ratings of map quality from various organizations (e.g., government vs. private); (13) perceived impacts of publicly available geological maps on the quality of projects; and (14) priority areas for future geological mapping.

Both questionnaires had significant input on content and review from a Steering Committee that consisted of Richard Berg (Director of the Illinois State Geological Survey), James Faulds (Director of the Nevada Bureau of Mines and Geology), Steven Masterman (Retired Director of the Alaska Division of Geological and Geophysical Surveys), John Parrish (Retired Director of the California Geological Survey), David Spears (Director [now retired] of the Virginia Department of Mines, Minerals, and Energy), Nick Tew (Director of the Alabama Geological Survey), and Richard Bernknopf (USGS-Retired and now with the University of New Mexico).

2.2: Data Acquisition — Cost Information

Following review and approval by the Steering Committee of the Excel spreadsheet for obtaining cost information from SGS and the USGS, the data gathering process commenced for this national economic analysis of the costs and benefits of geological mapping. On July 1, 2020, an email was sent to State Geologists and the National Cooperative Geologic Mapping Program (NCGMP) coordinator of the USGS requesting their full participation in the national assessment. The email contained (1) the blank Excel cost spreadsheet (Appendix 1) requesting their data on annual geological mapping costs from 1994–2019, present-day staffing, geological map coverages at various scales, and derivative mapping, and (2) an introductory letter detailing the program and its timelines. September 15, 2020 was given as the submission date for the cost information. However, this deadline was extended several times as SGS and the USGS requested additional time because of problems associated with (1) obtaining the cost data going back to 1994; (2) assessing geological mapping coverages and status of derivative maps; and (3) the Covid-19 pandemic. Therefore, to ensure completeness of the national assessment, cost sheets were accepted through September 2021.

The cost sheet contains three sections. Section 1 provides cost data from federal, state, and other sources estimated to the best of the abilities of SGS and the USGS. For SGS, federal funding sources included those from the STATEMAP program of the NCGMP, as well as from other USGS Mission Areas and other federal agencies. Much of the geological mapping funds that SGS received were from the STATEMAP program, and those data were readily available from an annually updated USGS spreadsheet. State funding sources included the 1:1 match requirement for funds received under the STATEMAP program, funding from other state agencies, and any personnel or other costs that contributed to geological mapping. Other sources included funding from county, municipal, private industry, and non-governmental organizations (NGOs; typically non-profit entities).

USGS federal funding sources included those received directly from congressional appropriations under the 1992 (and subsequent reauthorizations) National Geologic Mapping Act requirements, as well as from other USGS Mission areas involved with geological mapping and other federal agencies. USGS figures do not include funds received for STATEMAP, since those funds were distributed directly to SGS.

Also requested was a best estimation of the number of internet visitors, with the realization that these data may be difficult to assess by SGS and the USGS.

Section 2 of the spreadsheet documents geological mapping that was accomplished from 1994 to 2019 on a per square mile and percentage of jurisdiction basis. Also included were data on the extent of geological mapping at various scales (from >1:24,000 to <1:500,000) and, if possible, split between Quaternary and bedrock mapping products.

Section 3 of the spreadsheet focuses on derivative maps at small, medium, and large scales, including present-day availability, what needs updating, and desired future products. A list of 25 derivative options was provided, with the proviso that others could be added to the list. It was also stressed that because the production of derivative maps depends on geological mapping, and separating costs between derivative and basic maps would be nearly impossible, derivative map costs should be included in Section 1.

Between July 2020 and September 2021, cost sheets were obtained for 49 states. Hawaii is the only state that lacked an SGS over the 1994–2019 project period and therefore could not provide any cost data or send questionnaires to stakeholders. However, members of the Steering Committee worked with colleagues in Hawaii to distribute the stakeholder questionnaire. Since stakeholder data (via national efforts described below) were obtained for Hawaii, as well as the District of Columbia, it was assumed that geological mapping for these two jurisdictions was covered with direct USGS funds. Also, two other states (Georgia, which has not had an active SGS for several years, and Louisiana, which was transitioning to find a new State Geologist) did not respond to email requests for participation in the assessment. Therefore, cost sheets for these states were produced showing only STATEMAP funding and the required 1:1 state match.

For many of the SGS, the 1:1 match data were difficult to obtain, as these data were commonly not retained as paper copies or early computer files. Fortunately, the USGS NCGMP program office provided much of the needed match data. The match data are significant because many states matched the federal STATEMAP funds at considerably more than the required 1:1 ratio, seeking to demonstrate capacity that would justify increased Congressional and USGS STATEMAP funding.

2.3: Data Acquisition — Valuation Information

Following review and approval by the Steering Committee of the online questionnaire seeking information on the benefits of geological mapping (Appendix 2), and following numerous rounds of beta testing of the questionnaire’s online operability, a second email was sent to SGS on August 20, 2020. This email contained an online link to the questionnaire and requested distribution of the link to stakeholders and constituents. To increase the size of state stakeholder lists, SGS were asked to extend their stakeholder engagement to include statewide associations and societies for oil and gas, aggregates, water-well drillers, etc., with the intent that these statewide groups could send the questionnaire to their members and thereby significantly increase feedback. November 2, 2020 was given as the initial submission date for the questionnaires. As with the cost data, this deadline was extended several times. Extensions were needed for two primary reasons: (1) the time required for national and state associations and societies to forward the questionnaire to their members was underestimated, owing to delays in obtaining permissions from their management and to the timing of the questionnaire’s distribution in a monthly mailing or newsletter; and (2) the Covid-19 pandemic prevented face-to-face participation at association and society meetings to encourage stakeholders to participate. Therefore, to ensure completeness of the national assessment, questionnaire responses also were accepted through September 2021.

Also provided was a Word document with a template letter that SGS could modify accordingly and then send to their stakeholders and constituents. For example, emphasizing the importance of this endeavor to the mining industry is quite different from that to county planning agencies. Appendix 3 is an example of one letter that was distributed to economic development agencies. We asked that the number of distributed questionnaires be recorded, minus bounce backs, so that we could best evaluate the overall response rate. Stakeholders were also asked to answer as many of the questions as possible, with the full realization that all questions could not be answered by everyone.

Between August 2020 and September 2021, the email link to the online questionnaire was reported as sent to 81,072 stakeholders and constituents, and 4,779 responses were received, which was a ~6% response rate. Of those 81,072 questionnaires that were sent, 25,192 were sent by 10 national associations and societies (Table 2.3.1) as well as by numerous state associations and societies. For example, in Illinois, 10,247 questionnaires were sent by 17 associations and societies (Table 2.3.2). Participating national and state associations included those representing professional geologists, planners, and water professionals, as well as those from mining, the construction industry, state, city and county governments, academia, and the engineering community, all of which are direct and indirect beneficiaries of geological mapping produced by SGS and the USGS.
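The reported ~6% response rate follows directly from these figures; a one-line check of the arithmetic:

```python
# Response rate = responses received / questionnaires distributed (a minimum figure).
sent = 81_072      # questionnaires reported as distributed
received = 4_779   # responses received
rate = received / sent * 100
print(f"{rate:.1f}%")  # → 5.9%
```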

Table 2.3.1. Questionnaire Distribution to National Associations and Societies

American Council of Engineering Companies.
American Inst. of Mining, Metallurgical, Petroleum Engineers.
American Institute of Professional Geologists.
American Institute of State Boards of Geology.
American Planning Association.
American Water Works Association — included with 28 states numbers.
Geological Society of America (5 Divisions).
Industrial Minerals Association, North America.
National Asphalt Paving Association.
National Mining Association.

Table 2.3.2. Questionnaire Distribution to State Associations and Societies — Illinois Example

Association of General Contractors of Illinois.
Chicagoland Association of General Contractors.
Great Lakes Construction Association.
Illinois Asphalt Paving Association.
Illinois Association of Aggregate Producers.
Illinois Association of Counties.
Illinois Chapter, American Planning Association.
Illinois Coal Association.
Illinois Municipal League.
Illinois Oil & Gas Association.
Illinois Road Transportation Builders Association.
Illinois Rural Water Association.
Illinois Section, American Society of Civil Engineers.
Illinois Section, American Water Works Association.
Illinois Society of Professional Engineers.
Illinois Underground Contractors Association.
Structural Engineering Association of Illinois.

The number of questionnaires sent (81,072) is a minimum figure. Despite regular reminders to national and state associations and societies to report the number of questionnaires distributed to their members, several failed to report, and even among those that reported numbers, there was no control over individuals forwarding the questionnaire link to others. Also, some organizations posted the questionnaire on their websites, with little reporting of the number of “hits”. In fact, all stakeholders were encouraged to forward the link, since actual responses are more significant than the number sent.

Because the questionnaire was online and its distribution mechanisms differed among national and state associations and societies, a standardized counting approach was required, depending on whether the questionnaire link and an explanation of the program was (1) sent in a direct email, or (2) included in a bi-weekly, monthly, or quarterly newsletter. For the former, the organization simply reported the number of emails sent directly, and that number became part of the 81,072 sent questionnaires. For the latter, because the number of newsletters sent does not show whether they were opened, we followed up with a request for the number of newsletter emails that were opened. In both scenarios, we assumed that an opened email or an opened newsletter was equivalent to opening a piece of “hard mail” containing the questionnaire and then filling it out. Most organizations were able to report the number of newsletter emails that were opened; for the few that could not, the number of newsletters sent was counted as part of the 81,072 sent questionnaires.

A total of 55,880 questionnaires were sent by SGS directly to their individual constituents. It was assumed that SGS and the USGS stakeholders were the same pool, and therefore to reduce duplication of effort, only the SGS distributed the questionnaire link.

Twelve SGS either asked for assistance in assembling lists of stakeholders or had no capacity to do so. Therefore, staff at the Illinois State Geological Survey (ISGS) developed stakeholder lists for these 12 states based on extensive web searches. As an example of the stakeholder categories chosen to receive the questionnaires, the ISGS classified stakeholders into categories including: (1) economic development; (2) NGOs; (3) state government (planning, engineering, water resources, emergency management, EPAs, public health, natural resources, and mining); (4) county and municipal government (planning, zoning, highways/engineering, GIS, emergency management, public health, and real estate); (5) associations and societies; (6) excavation, construction, and site development companies; (7) environmental, geotechnical, and engineering companies; (8) rock and mineral clubs; and (9) conservation districts. They also developed customized email text to accompany the questionnaire link. Those SGS that requested assistance were provided with their stakeholder lists and asked to review them and add or delete entries. They were then given the option of sending the questionnaire to their stakeholders or jointly sending it with ISGS project staff, and, importantly, of reporting the number of distributed questionnaires back to the project staff. For SGS lacking the capacity to participate in the project, ISGS project staff distributed the questionnaire to the stakeholder lists with a customized email.

It was recognized in the first few months of acquiring questionnaire responses that ~40% of respondents had not completed the online form. Those respondents were identified, provided access to their original submissions, and then given the opportunity to complete the questionnaire and re-submit the form. Unfortunately, only ~150 respondents availed themselves of the opportunity. However, all questions answered by the respondents were accepted and are part of the database.

There was also concern about the viability of the response rate, following communication from some respondents regarding their reluctance to click on the questionnaire link for fear that it would lead them to an unsafe website. This concern persisted despite clear identification of the program and who was conducting it, its goals and outcomes, and direct phone and email contact information for those distributing the questionnaire. Others said they did not complete the form because the questionnaire was too long. However, the introductory emails and letters distributed with the questionnaire made clear that respondents did not have to answer all questions, only those they felt qualified to answer, and that they did not have to provide long text answers to several questions.

Research on response rates for surveys does not provide a definitive guide to their adequacy. Online surveys usually have lower response rates than in-person surveys. Wu et al. (2022) conducted “a comprehensive search, screened 8,672 studies, and examined 1,071 online survey response rates reported in education-related research…. The average online survey response rate was 44.1%…. [S]ending an online survey to more participants did not generate a higher response rate. Instead, sending surveys to a clearly defined and refined population positively impacts the online survey response rate. In addition, pre-contacting potential participants, using other types of surveys in conjunction with online surveys, and using phone calls to remind participants about the online survey could also yield a higher response rate…. Other factors that impacted the rates included the funding status of a project, and the age and occupation of the participants.”

Marketing companies work with their own assessments of survey response rates. For example, Malnik (2023) reported a range of response rates depending on the survey method: the average good response rate was reported as 30%, whereas a good online survey response rate was reported as 29%.

Lastly, previous economic analyses that evaluated costs and benefits in the discipline of geology showed a wide range of response rates: Kentucky (20%) (Bhagwat and Ipe, 2000), Spain (26%) (Garcia-Cortes et al., 2005), Nevada (4.6%) (Bhagwat, 2014), Indiana (28.5%) (Capstone Class 7933, V-600, 2017), and Ohio (63.6%) (Kleinhenz & Associates, 2011). The 6% response rate of the present study needs to be viewed differently, because the previous studies all covered relatively small geographic areas compared with the present study, which covered the entire U.S. The method of reaching intended audiences in the present study had to be less direct, the survey included many more questions than in previous studies, and many queries required descriptive responses. Long, descriptive surveys tend to elicit fewer responses.

2.4: Database Development

2.4.1: Questionnaire Response Data

The questionnaire yielded 4,779 viable response sets (reduced to 4,577 by deleting those from SGS and foreign-only respondents) from geoscience and other stakeholders nationwide. Raw data were received from a contracted third-party online survey vendor in the form of a Microsoft (MS) Excel flat file report. Prior to analysis, these data were transformed into an MS Access relational database format. The relational database model improved machine readability and facilitated powerful query operations via built-in Structured Query Language (SQL). The database also provided a convenient package for query versioning and portability, while integrating well with analytical tools such as R, Python, and GIS software.
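The advantage of the relational model over a flat file can be sketched in miniature. The example below uses Python’s built-in sqlite3 module in place of MS Access, and all table and column names are hypothetical illustrations, not the study’s actual schema:

```python
# Minimal sketch of querying survey responses relationally; sqlite3 stands in
# for MS Access, and the schema and data are invented for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE respondents (id INTEGER PRIMARY KEY, org_type TEXT)")
con.execute("CREATE TABLE answers (respondent_id INTEGER, question TEXT, value TEXT)")
con.executemany("INSERT INTO respondents VALUES (?, ?)",
                [(1, "state government"), (2, "consulting"), (3, "consulting")])
con.executemany("INSERT INTO answers VALUES (?, ?, ?)",
                [(1, "preferred_format", "digital"),
                 (2, "preferred_format", "digital"),
                 (3, "preferred_format", "paper")])

# A join plus GROUP BY replaces manual filtering of the original flat file.
rows = con.execute("""
    SELECT r.org_type, a.value, COUNT(*) AS n
    FROM respondents r JOIN answers a ON a.respondent_id = r.id
    WHERE a.question = 'preferred_format'
    GROUP BY r.org_type, a.value
    ORDER BY r.org_type, a.value
""").fetchall()
print(rows)
# → [('consulting', 'digital', 1), ('consulting', 'paper', 1), ('state government', 'digital', 1)]
```

The same cross-tabulation against a flat file would require bespoke filtering code for every question of interest.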

In migrating these data, the following cleaning and quality-of-life transformations were made:

  • Implementation of controlled vocabulary — common categorical responses were identified among several free-response questions. For these responses, various spellings and abbreviations of like categorical values were assimilated to establish controlled domains. This practice simplified SQL operations for selecting and filtering the data.

  • Feature scaling of disparate ranked data to common scales — the questionnaire contained several groupings of questions that asked for a ranked response, typically evaluating expert opinion. However, different questions employed varying bin scales (e.g., 1–5, 1–10, etc.). This contrast was not recognized during the pre-survey review; however, all ranked responses were normalized to a common scale to mitigate errors that otherwise would have arisen in analysis.

  • Miscellaneous parsing of data from the vendor-supplied format into schema that simplify analysis workflows — for example, multi-select response data were delivered as many individual columns in the flat file report; these were transformed into a single array-like entry per question, optimal for the writing and execution time of queries and analysis code.

  • Aliasing of questions and categorical responses for short yet human-readable queries.

  • Redaction of personal identifiable information (PII), such as IP addresses logged by the survey vendor, or contact information volunteered by responders in the additional comments section of the questionnaire.
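Three of the transformations above can be sketched in miniature; the vocabulary, bin scales, and option names below are illustrative assumptions, not the project’s actual domains or schema:

```python
# Hedged sketch of controlled vocabulary, feature scaling, and multi-select
# parsing; all mappings and column names are invented for illustration.

# 1. Controlled vocabulary: collapse variant spellings into one domain value.
VOCAB = {"st gov": "state government", "state govt": "state government"}
def normalize(raw: str) -> str:
    key = raw.strip().lower()
    return VOCAB.get(key, key)

# 2. Feature scaling: map a ranked answer from its native bin scale
#    (e.g., 1-10) onto a common scale (e.g., 1-5).
def rescale(value, lo, hi, new_lo=1, new_hi=5):
    return new_lo + (value - lo) * (new_hi - new_lo) / (hi - lo)

# 3. Multi-select parsing: one flat-file column per option becomes a single
#    array-like entry per question.
def collapse_multiselect(row, option_columns):
    return [opt for opt in option_columns if row.get(opt)]

print(normalize("State Govt"))          # → state government
print(round(rescale(7, 1, 10), 2))      # → 3.67
print(collapse_multiselect({"digital": 1, "paper": 0, "GIS": 1},
                           ["digital", "paper", "GIS"]))  # → ['digital', 'GIS']
```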

2.4.2: Narrative Response Data

A particular challenge to the data ingestion process was encountered in the overwhelming response to long text-based narrative questions. These questions took such forms as “Please describe an example of […]” or “Optionally, provide additional comments on […].” The questionnaire contained eight of these long-form questions. Among these, we received approximately 14,000 individual non-null responses, at an average of 26 words per response — or roughly 700 pages of narrative information.

To summarize these responses for use in these analyses, the narratives were assigned categorical values corresponding to major topics. This task was partially automated through development of custom Python code using the open-source Natural Language Toolkit (NLTK) package, a leading platform for building programs that work with human language data and computational linguistics.

At a high level, the analytical approach involved labeling training data by manually reading and categorizing 15% of the responses for each question. In parallel, lists of non-overlapping keywords thought to be indicative of each category were initiated. The training data were then analyzed in NLTK for word-use frequency to generate additional predictive keywords based on a frequency threshold. The NLTK analysis applied a Snowball (or “Porter2”) stemming algorithm to consider word roots only, and dismissed common English and geology-related stop words (e.g., “a,” “the,” “is,” etc.) expected to have no bearing on keyword-based categorization. Upon supervised determination of additional keywords, category codes were mapped to each response based on keyword presence.
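A minimal, standard-library illustration of the keyword-frequency step follows; a crude suffix stripper stands in for NLTK’s Snowball stemmer, and the training labels, stop words, and frequency threshold are invented for the example:

```python
# Sketch of generating candidate predictive keywords from labeled training
# data by stemmed word-use frequency; not the project's actual code.
from collections import Counter

STOP = {"a", "the", "is", "of", "for", "and", "to"}  # illustrative stop words

def stem(word):
    # Very rough stand-in for a Snowball/Porter2 stemmer.
    for suffix in ("ing", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def candidate_keywords(labeled_responses, category, threshold=2):
    """Count stemmed, non-stop words in responses already labeled with
    `category`; stems meeting the frequency threshold become candidate
    predictive keywords for that category."""
    counts = Counter(
        stem(w)
        for text, label in labeled_responses if label == category
        for w in text.lower().split() if w not in STOP
    )
    return {s for s, n in counts.items() if n >= threshold}

training = [
    ("maps used for groundwater wells", "water"),
    ("groundwater protection mapping", "water"),
    ("aggregate resources for roads", "minerals"),
]
print(candidate_keywords(training, "water"))  # → {'groundwater'}
```

In the actual workflow the surviving keywords were reviewed (supervised) before category codes were mapped onto the unread responses.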

The analysis resulted in automated categorization of 65–90% of responses per question. Remaining outliers were categorized manually, and predicted categories were spot checked to evaluate the accuracy of the automation. Internal reviewers were satisfied with the results of the NLTK approach, and thus the coded narrative data were incorporated into further statistical analyses (see Chapters 10 and 12). Additionally, robust pattern-recognition tools were developed and implemented to parse dollar-value ranges and other useful numerical figures from the narratives.
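The dollar-range parsing can be illustrated with a regular expression; the pattern and sample narrative below are examples, not the project’s actual parser:

```python
# Illustrative extraction of (low, high) dollar-value ranges from free text.
import re

DOLLAR_RANGE = re.compile(
    r"\$\s*([\d,]+(?:\.\d+)?)\s*(?:-|–|to)\s*\$?\s*([\d,]+(?:\.\d+)?)"
)

def parse_dollar_ranges(text):
    """Return (low, high) float pairs for every dollar range found in text."""
    return [(float(lo.replace(",", "")), float(hi.replace(",", "")))
            for lo, hi in DOLLAR_RANGE.findall(text)]

print(parse_dollar_ranges("Saved roughly $10,000-$50,000 per project."))
# → [(10000.0, 50000.0)]
```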

2.4.3: Geological Survey Cost Reporting Data

Similar to the questionnaire response data, the forms showing SGS and USGS reported cost data were organized into a second MS Access relational database. Here, data were systematically ingested from 49 individual SGS MS Excel reporting files and from a cost report furnished by the USGS. These data are captured in thematic tables, with a relational SGS ID, and included such attributes as: state vs. federal funding of agency mapping over the 1994–2019 project period; employee type distribution (geoscientist, administrative, etc.); existing state map coverages at various scales; and derivative map status and needs. The SGS and USGS cost data were augmented with web product view/download statistics (see Chapters 4, 5, and 7).

2.4.4: Metadata Documentation

All prompts, codes, aliases, and other data definitions were documented in the data dictionary tables within each database. This ensured that databases could be effectively explored by others as stand-alone products and facilitated queries of metadata alongside responses. This documentation also will be available on the repository listing page of the corresponding data release.

All working datasets, documentation, analysis products, and project management materials for this effort were maintained on a collaborative cloud storage service, with organization, version control, access control, and backups internally managed by the NBMG Geoscience Data Manager.

Both databases described above (Questionnaire Results and Survey Cost Reporting) are publicly available from AGI in parallel with this report. The data release includes Microsoft Access database (.accdb), DuckDB (.db), Comma Separated Value (.csv) and Apache Parquet (.parquet) file formats, with plain-text documentation. The release contains full response data minus any redacted PII.

2.5: References

Bhagwat, S., 2014, The Nevada Bureau of Mines: current and future benefits to the university, the state, and the region: Nevada Bureau of Mines and Geology, University of Nevada, Reno Special Publication 38, 64 p., https://pubs.nbmg.unr.edu/NBMG-current-andfuture-p/sp038.htm.

Bhagwat, S.B. and Berg, R.C., 1991, Benefits and costs of geologic mapping programs in Illinois: Case study of Boone and Winnebago Counties and its statewide applicability: Illinois State Geological Survey Circular 549, 40 p., https://archive.org/details/benefitscostsofg549bhag.

Bhagwat, S.B. and Ipe, V.C., 2000, Economic benefits of detailed geologic mapping to Kentucky: Illinois State Geological Survey Special Report 3, 39 p., https://www.ideals.illinois.edu/items/45200.

Capstone Class 7933, V-600, 2017, An economic impact analysis of the Indiana Geological Survey: School of Public & Environmental Affairs, Indiana University, 65 p., https://capstone.oneill.indiana.edu/reports/An%20Economic%20Impact%20Analysis%20of%20the%20Indiana%20Geological%20Survey.pdf.

Chiavacci, S.J., Shapiro, C.D., Pindilli, E.J., Casey, C.F., Rayens, M.K., Wiggins, A.T., Andrews, W.M., and Hahn, E.J., 2020, Economic valuation of health benefits from using geologic data to communicate radon risk potential: Environmental Health, https://doi.org/10.1186/s12940-020-00589-8.

Garcia-Cortes, A., Vivancos, J., and Fernández-Gianotti, J., 2005, Economic and social value of the MAGNA plan: Boletin Geologico y Minero, v. 116, no. 4, p. 291–305, https://www.researchgate.net/publication/286984130_Economic_and_social_value_of_the_MAGNA_Plan.

Häggquist, E. and Söderholm, P., 2015, The economic value of geological information: synthesis and directions for future research: Resources Policy, v. 43, p. 91–100, https://www.sciencedirect.com/science/article/pii/S0301420714000804.

Kleinhenz & Associates, 2011, An economic impact analysis of the Ohio Geological Survey’s products and services: Kleinhenz & Associates, 25 p., https://www.kleinhenzassociates.com/wp-content/uploads/2015/12/EIA-Full-Report.pdf.

Lizzuo, C., Bartels, A., Brands, C.C., and Yashi, A., 2019, Arizona Geological Survey economic impact report: Arizona Geological Survey Contributed Report OFR-19-A, 21 p., https://azgs.arizona.edu/publication/arizona-geological-survey-economic-impact-report.

Malnik, J., 2023, Survey Benchmarks: What’s a good survey response rate? https://www.xola.com/articles/survey

Wu, M.-J., Zhao, K., and Fils-Aime, F., 2022, Response rates of online surveys in published research: A meta-analysis: Computers in Human Behavior Reports, v. 7, 11 p., https://www.sciencedirect.com/science/article/pii/S2451958822000409.