NSF 22-519: Internet Measurement Research: Methodologies, Tools, and Infrastructure (IMR)

Program Solicitation

Document Information

Document History

  • Posted: October 27, 2021

Program Solicitation NSF 22-519

National Science Foundation

Directorate for Computer and Information Science and Engineering
     Division of Computer and Network Systems
     Division of Computing and Communication Foundations
     Office of Advanced Cyberinfrastructure

Directorate for Mathematical and Physical Sciences
     Division of Mathematical Sciences

Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

     February 15, 2022 (Track 1)

     March 08, 2022 (Track 2)

     March 22, 2022 (Track 3)

     February 15, 2023 (Track 1)

     March 08, 2023 (Track 2)

Important Information And Revision Notes

Innovating and migrating proposal preparation and submission capabilities from FastLane to Research.gov is part of the ongoing NSF information technology modernization efforts, as described in Important Notice No. 147. In support of these efforts, research proposals submitted in response to this program solicitation must be prepared and submitted via Research.gov or via Grants.gov, and may not be prepared or submitted via FastLane.

Any proposal submitted in response to this solicitation should be submitted in accordance with the revised NSF Proposal & Award Policies & Procedures Guide (PAPPG) (NSF 22-1), which is effective for proposals submitted, or due, on or after October 4, 2021.

Summary Of Program Requirements

General Information

Program Title:

Internet Measurement Research: Methodologies, Tools, and Infrastructure (IMR)

Synopsis of Program:

With this solicitation, the National Science Foundation's (NSF) Directorate for Computer and Information Science and Engineering (CISE), in partnership with the Directorate for Mathematical and Physical Sciences (MPS), is launching a new, focused program to support methodologies, tools, and research infrastructure for Internet measurement spanning access (both wireless and fixed broadband) and core Internet. Currently, Internet measurement is conducted in a piecemeal and uncoordinated manner, and the infrastructure to collect, share, and process the data does not cover all aspects of the network across both wireless and wired Internet. The scope, complexity, and means of accessing the Internet have changed dramatically throughout its existence. Internet measurement work has mostly focused on the wired core networks, for which existing Internet measurement repositories and infrastructure provide yeoman service. Methods, data collection, and data sharing have not kept up with the importance and proliferation of wireless and fixed broadband access networks. This leaves important aspects of the access network (both wireless and fixed broadband) in many geographic regions unmeasured or under-measured. With citizens now using cellular phones to access the Internet, more accurate and/or additional ways to measure and assess performance, connectivity, network topology, and service gaps have also become necessary. The goal of the IMR program is to encourage, coordinate, and connect research in Internet measurement in a comprehensive manner. Such research, especially the development of new methodologies and tools, is essential and timely to assess the health of the Internet more comprehensively, improve network technologies and systems, and develop new methods of networking.

The IMR program will support three award tracks:

  • Track 1: Methodologies and Methods (MM) track awards support the creation of new methods for collecting, anonymizing, modeling, and analyzing Internet measurement data. The award track will support three subtracks. The first subtrack is statistical methodologies, with awards supporting the creation of new stochastic models and statistical methodologies for Internet measurement research, such as methodologies that adjust for non-representative data (e.g., via data imputation), provide accurate results despite limited or sparse data, or support new ways to analyze Internet measurement data. The second subtrack is privacy-preserving methodologies, with awards supporting innovative techniques or methodologies to ensure privacy protection during the collection, sharing, and analysis of Internet measurement data. The third subtrack is other methodologies, with awards supporting the creation of new Internet measurement methodologies, analyses, or post-processing not covered by the first two subtracks, such as extending the footprint of current data collection methods (e.g., to access networks), methods for measuring the IPv6 address space, or integrating the measurement of access (both wireless and fixed broadband) and core networks.
  • Track 2: Measurement Tool Development and Demonstration (MT) track awards support the creation and deployment of new tools to collect Internet measurement data. These tools may be based on methodologies such as those supported by Track 1 or other prior work. These tools should ultimately be publicly available and include both active and passive measurement tools.
  • Track 3: Internet Measurement Related Infrastructure-Planning (RI-P) track awards will support planning toward the creation of infrastructure for hosting measurement tools and data. The eventual infrastructure will make data available to the research community, including curating the data, ensuring an appropriate level of privacy protection, and developing necessary exchange formats, tools, and mechanisms; it will also host tools, integrating the outcomes from Track 2. In this solicitation, only planning proposals are sought for this track; a separate future solicitation with further details for full infrastructure proposals will likely be issued later, with full proposal submission expected in FY 24, subject to the availability of funds.

The methodologies developed in Track 1 will help facilitate sharing and analysis, in a privacy-preserving manner, of the data eventually stored in the infrastructure sought through Track 3, and will lead to the innovative data collection tools developed in Track 2. Tools developed in Track 2 may lead to curated data being added to the infrastructure identified through Track 3. Thus, the infrastructure sought through Track 3 will have multiple purposes. The integration of outcomes of the three tracks should help synthesize a more holistic or comprehensive approach to Internet measurement across different components of the Internet, including core, access, wired, and wireless networks, considering security and privacy implications.

Cognizant Program Officer(s):

Please note that the following information is current at the time of publishing. See program website for any updates to the points of contact.

  • Deepankar (Deep) Medhi, Program Director, CISE/CNS, telephone: (703) 292-2935, email: dmedhi@nsf.gov
  • Ann C. Von Lehmen, Program Director, CISE/CNS, telephone: (703) 292-4756, email: avonlehm@nsf.gov
  • Darleen L. Fisher, Program Director, CISE/CNS, telephone: (703) 292-8950, email: dlfisher@nsf.gov
  • Alexander Sprintson, Program Director, CISE/CNS, telephone: (703) 292-8950, email: asprints@nsf.gov
  • Murat Torlak, Program Director, CISE/CNS, telephone: (703) 292-7748, email: mtorlak@nsf.gov
  • Daniela Oliveira, Program Director, CISE/CNS, telephone: (703) 292-4352, email: doliveir@nsf.gov
  • James Joshi, Program Director, CISE/CNS, telephone: (703) 292-8950, email: jjoshi@nsf.gov
  • Funda Ergun, Program Director, CISE/CCF, telephone: (703) 292-2216, email: fergun@nsf.gov
  • Robert Beverly, Program Director, CISE/OAC, telephone: (703) 292-7068, email: rbeverly@nsf.gov
  • Kevin L. Thompson, Program Director, CISE/OAC, telephone: (703) 292-4220, email: kthompso@nsf.gov
  • Edsel A. Pena, Program Director, MPS/DMS, telephone: (703) 292-8080, email: epena@nsf.gov

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.049 --- Mathematical and Physical Sciences
  • 47.070 --- Computer and Information Science and Engineering

Award Information

Anticipated Type of Award: Standard Grant or Continuing Grant

Estimated Number of Awards: 26

Anticipated number, duration, and size of new awards:

Track 1: Methodologies and Methods (MM)

  • Number of awards: Approximately 11
  • Project length: 3 to 4 years
  • Award size: Up to $600,000

Track 2: Measurement Tool Development and Demonstration (MT)

  • Number of awards: Approximately 11
  • Project length: 2 years
  • Award size: Up to $600,000

Track 3: Internet Measurement Related Infrastructure-Planning (RI-P)

  • Number of awards: Approximately 4
  • Project length: 1 year
  • Award size: Up to $100,000

Anticipated Funding Amount: $14,000,000

Estimated program budget, number of awards and average award size/duration are subject to the availability of funds and quality of proposals received.

Eligibility Information

Who May Submit Proposals:

Proposals may only be submitted by the following:

  • Institutions of Higher Education (IHEs) - Two- and four-year IHEs (including community colleges) accredited in, and having a campus located in the US, acting on behalf of their faculty members. Special Instructions for International Branch Campuses of US IHEs: If the proposal includes funding to be provided to an international branch campus of a US institution of higher education (including through use of subawards and consultant arrangements), the proposer must explain the benefit(s) to the project of performance at the international branch campus, and justify why the project activities cannot be performed at the US campus.
  • Non-profit, non-academic organizations: Independent museums, observatories, research labs, professional societies and similar organizations in the U.S. associated with educational or research activities.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or co-PI:

There are no restrictions or limits.

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions

  • Letters of Intent: Not required
  • Preliminary Proposal Submission: Not required
  • Full Proposals:

B. Budgetary Information

  • Cost Sharing Requirements:

    Inclusion of voluntary committed cost sharing is prohibited.

  • Indirect Cost (F&A) Limitations:

    Not Applicable

  • Other Budgetary Limitations:

    Not Applicable

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

         February 15, 2022 (Track 1)

         March 08, 2022 (Track 2)

         March 22, 2022 (Track 3)

         February 15, 2023 (Track 1)

         March 08, 2023 (Track 2)

Proposal Review Information Criteria

Merit Review Criteria:

National Science Board approved criteria. Additional merit review criteria apply. Please see the full text of this solicitation for further information.

Award Administration Information

Award Conditions:

Additional award conditions apply. Please see the full text of this solicitation for further information.

Reporting Requirements:

Standard NSF reporting requirements apply.

I. Introduction

The National Science Foundation's (NSF) Directorate for Computer and Information Science and Engineering (CISE), in partnership with the Directorate for Mathematical and Physical Sciences (MPS), is launching a new, focused program to support research infrastructure, methods, and tools for Internet measurement. The goal of the IMR program is to encourage, coordinate, and connect research in Internet measurement. The need for such a program was elucidated by COVID-19 network performance measurement studies[1], a series of CISE Internet Measurement workshops[2], the associated workshop report[3], and responses to an NSF Request for Information[4].

During the COVID-19 pandemic, NSF funded several measurement studies to understand the pandemic's impact on the Internet, especially given the significant increase in citizens working from home. These studies helped to identify gaps in Internet measurement, such as measurement research being conducted in a piecemeal and uncoordinated manner, limited by network access, and measurement techniques failing to provide a comprehensive picture of Internet performance, connectivity, and cybersecurity threats. The scope, complexity, and means of accessing the Internet have changed dramatically throughout its existence. Internet measurement work has mostly focused on the wired core networks, for which existing Internet measurement repositories and infrastructure provide yeoman service. Internet measurement methods/tools, data collection, and data sharing have not kept up with the importance and proliferation of wireless and fixed broadband access networks. This leaves important aspects of the access network (both wireless and fixed broadband) in many geographic regions unmeasured or under-measured. With citizens now using cellular phones to access the Internet, more accurate and/or additional ways to measure and assess performance, connectivity, network topology, service gaps, and cyber threats have also become necessary. There is also a need to integrate measurements from core and access networks, both wired and wireless. Integration of the wired and wireless aspects of the network is necessary to characterize and segment performance and the effects of potential changes on the Internet. The current disconnected research undermines the community's ability to assess the health of the Internet, improve existing networking technologies, and develop new networking technologies at multiple levels in a comprehensive manner for both access and core networks. Furthermore, there is a lack of understanding of longitudinal behavior and trends due to the limited duration of measurement windows.

Various barriers challenge Internet measurement research[3]. Current data is often unavailable or owned by commercial network service providers. When available, such data from providers may lack adequate granularity and detail for the research community and for understanding a more fine-grained impact. Commercially owned data may require significant negotiation to access and may raise questions about objectivity due to the involvement of a commercial firm. Collected data may be skewed, for example, due to disproportionate levels of data collection in particular locations or differences in measurement compliance. Measurement data may also contain personally identifying information (PII) or other private data, including sensitive corporate data, raising privacy concerns and restricting data sharing. The research community has recognized the need for studies to characterize important network properties, to collect and share longitudinal data, and to ensure privacy protection during data collection, sharing, and analysis[3,4].

This solicitation has been informed by the findings of NSF-funded Internet Measurement workshops that took place in January and April of 2021 [2]. This solicitation will fund projects in three tracks: Methodologies and Methods (Track 1, MM), Measurement Tool Development and Demonstration (Track 2, MT), and Internet Measurement Related Infrastructure-Planning (Track 3, RI-P). Track 1 (MM) will fund projects that create new methods for collecting, anonymizing, modeling, and analyzing Internet measurement data. Track 2 (MT) will fund projects that support the creation and deployment of new tools to collect Internet measurement data. Track 3 (RI-P) seeks planning proposals for the creation of an Internet measurement infrastructure for hosting measurement tools and data.


[1] NSF awarded COVID-19 RAPID Internet measurement projects.

[2] Workshop on Overcoming Measurement Barriers to Internet Research (WOMBIR 2021).

[3] Claffy, K.C., D. Clark, F.E. Bustamante, J. Heidemann, M. Jonker, A. Schulman, and E. Zegura. 2021. Workshop on Overcoming Measurement Barriers to Internet Research (WOMBIR 2021): Final Workshop Report. ACM SIGCOMM Computer Communication Review.

[4] Martonosi, M. 2021. Dear Colleague Letter NSF 21-056: Request for Information on the specific needs for datasets to conduct research on computer and network systems.

II. Program Description

The goal of the IMR program is to encourage, coordinate, and connect research in Internet measurement.

Recognizing that Internet measurement research needs a more comprehensive infrastructure for hosting tools and data, new and improved methodologies for data collection, sharing, and analysis with appropriate privacy protection, and new measurement tools across access and core networks, this solicitation invites proposals in three "Tracks." Track 1 proposals, Methodologies and Methods (MM), will advance the state of the art in approaches for data collection, analysis, and/or privacy-preserving sharing. Track 2 proposals, Measurement Tool Development and Demonstration (MT), will develop and deploy new measurement tools. Track 3 proposals, Internet Measurement-Related Infrastructure-Planning (RI-P), are planning proposals toward the community infrastructure that will eventually host and share data and measurement tools, considering privacy implications; a separate solicitation for the full infrastructure will likely be issued in FY24.

Together, these three Tracks will encourage, coordinate, and connect research in Internet measurement. The methodologies developed in Track 1 will help facilitate sharing or analysis of the data stored in the infrastructure sought via Track 3 in a privacy-preserving manner and will aid in developing the innovative data collection tools developed in Track 2. Tools developed in Track 2 may lead to curated data being added to the infrastructure identified through Track 3. Thus, the eventual infrastructure sought through Track 3 will have multiple purposes. The integration of outcomes of the three tracks should help synthesize a more holistic or comprehensive approach to Internet measurement across different components of the Internet, including core, access, wired, and wireless networks.

Proposals specifically responding to this IMR program solicitation must follow the guidance provided here while also being responsive to the program's goals above.

PROJECT TRACKS

This program solicitation offers three tracks: Track 1: Methodologies and Methods (MM); Track 2: Measurement Tool Development and Demonstration (MT); and Track 3: Internet Measurement-Related Infrastructure-Planning (RI-P).

Track 1: Methodologies and Methods (MM) (which includes three subtracks)

Subtrack 1A - Statistical Methodologies: This subtrack is intended to develop new stochastic models and statistical methodologies for Internet measurement data from fixed, wireless, or core Internet. Such models and methodologies may include, but are not restricted to, the following (an illustrative sketch appears at the end of this subtrack):

  • Statistical methodologies (including experiment design) to de-bias skewed Internet measurements (e.g., skew due to sparsity of collected data, lack of geographic representation, or reliance on crowd-sourced data);
  • Statistical sampling methods to obtain holistic measurements of the Internet in different dimensions for a better sense of the health of the Internet at various levels;
  • Models that allow for the normalization of collected data or extrapolation from collected data;
  • Internet data modeling at different frequencies including high-dimension high-frequency and mixed frequency data analysis;
  • Longitudinal studies of Internet measurements;
  • Statistical analysis, such as change-point and regression analysis of Internet multi-streaming data; and
  • Assessment of the validity of crowd-sourced datasets.

Collaborations between Internet measurement and statistics researchers are strongly encouraged.
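
As a concrete, hypothetical illustration of the kind of de-biasing methodology this subtrack envisions, the sketch below re-weights crowd-sourced download measurements by known regional population shares (post-stratification). The data, region labels, and population shares are invented for illustration only; a funded project would justify and validate its own statistical approach.

    # Minimal post-stratification sketch (hypothetical data and region shares).
    from collections import defaultdict

    def poststratified_mean(observations, population_shares):
        """Re-weight per-region sample means by known population shares."""
        sums = defaultdict(float)
        counts = defaultdict(int)
        for region, value in observations:
            sums[region] += value
            counts[region] += 1
        estimate = 0.0
        for region, share in population_shares.items():
            if counts[region] == 0:
                continue  # unobserved region; a real method would impute or flag it
            estimate += share * (sums[region] / counts[region])
        return estimate

    # Hypothetical crowd-sourced download speeds: urban tests are over-represented.
    observations = [("urban", 250.0)] * 90 + [("rural", 25.0)] * 10
    population_shares = {"urban": 0.6, "rural": 0.4}

    print(poststratified_mean(observations, population_shares))   # weighted mean, ~160.0
    print(sum(v for _, v in observations) / len(observations))    # naive mean, ~227.5

In practice, such re-weighting is only one of many possible adjustments; methods for imputation, sparse-data inference, or longitudinal analysis would replace or extend this step.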

Subtrack 1B - Privacy-Preserving Methodologies: This subtrack is intended to improve Internet measurement data collection and dissemination by addressing privacy concerns across the data lifecycle. Proposals in this subtrack may include, though are not limited to, the following (an illustrative sketch appears at the end of this subtrack):

  • Collection methodologies that de-identify data at the outset;
  • Creation of methods to de-identify existing data;
  • Creation of anonymization methods which do not compromise data quality and utility;
  • Creation of methods to access and/or share collected data in a privacy-preserving manner; and
  • Privacy preserving analysis of Internet data.

Collaborations between Internet measurement, security, and privacy researchers are strongly encouraged.
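
As a minimal, hypothetical sketch of a privacy-preserving collection step of the kind listed above, the example below truncates client IPv4 addresses to a /24 and replaces them with a keyed-hash pseudonym before storage. The key, prefix length, and pseudonym length are assumptions for illustration; a real methodology would also address key management, re-identification risk, and the utility lost by truncation.

    # Keyed, prefix-truncating pseudonymization sketch (illustrative only).
    import hashlib
    import hmac
    import ipaddress

    SECRET_KEY = b"rotate-me-per-collection-period"  # assumption: managed out of band

    def pseudonymize_ipv4(addr: str, prefix_len: int = 24) -> str:
        """Map an IPv4 address to a short pseudonym of its /prefix_len network."""
        network = ipaddress.ip_network(f"{addr}/{prefix_len}", strict=False)
        digest = hmac.new(SECRET_KEY, str(network.network_address).encode(),
                          hashlib.sha256).hexdigest()
        return digest[:16]

    print(pseudonymize_ipv4("192.0.2.17"))    # same pseudonym for any 192.0.2.x client
    print(pseudonymize_ipv4("192.0.2.200"))
    print(pseudonymize_ipv4("198.51.100.4"))  # different /24, different pseudonym

Keyed hashing alone does not defeat exhaustive enumeration of the IPv4 space if the key is compromised, which is one reason this subtrack calls for anonymization methods that preserve utility without weakening protection.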

Subtrack 1C - Other Methodologies: This subtrack is intended to support the creation of new Internet measurement methodologies, analyses, or post-processing not covered by the first two subtracks. Proposals to this subtrack may include, but are not limited to, the following (an illustrative sketch appears after this list):

  • Reduction of the footprint or improvement of the efficiency of current data collection methods;
  • Integration of the measurement of core and access networks with the ability to characterize performance by segment;
  • AI or ML-based methodologies for Internet measurement;
  • Measurement methods for the IPv6 address space; and
  • Examination of the extent to which cross layer specifications or measurements can be rendered useful (e.g., propagation maps, signal-related measurements), and assessment of whether these are applicable to mobile Internet performance.
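
To illustrate why the IPv6 address space item above calls for new measurement methods, the hypothetical sketch below generates candidate probe targets from seed /64 prefixes rather than attempting an exhaustive scan, which is infeasible in IPv6. The seed prefixes and the low-byte heuristic are illustrative assumptions, not a prescribed technique.

    # Candidate-generation sketch for IPv6 measurement (illustrative heuristic only).
    import ipaddress

    def low_byte_candidates(seed_prefixes, last_values=range(1, 5)):
        """Yield probe targets with small interface identifiers inside seed /64s."""
        for prefix in seed_prefixes:
            network = ipaddress.ip_network(prefix)
            for value in last_values:
                yield str(network.network_address + value)

    seeds = ["2001:db8:1:1::/64", "2001:db8:2:ab::/64"]  # documentation prefixes only
    for target in low_byte_candidates(seeds):
        print(target)  # e.g., 2001:db8:1:1::1, ..., candidates for active probing

Funded work might instead use learned or structural models of address assignment; the point is only that IPv6 scale forces target selection to become part of the methodology.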

Methodologies and Methods (MM) proposals can have total budgets of up to $600,000 for up to 4 years. Proposals including novel methodologies are encouraged. Proposals in this track may also consider identifying useful composable metrics, passive metrics that could be exposed and used in tandem with active tests, or passive and active metrics that could be combined to assess a particular situation.

Track 2: Internet Measurement Tool Development and Demonstration (MT)

This track is intended to support the development of deployable Internet measurement data collection tools that others can use (e.g., the research community and citizen scientists). Such tools may collect data at different levels, including but not limited to the core Internet, the mobile Internet (via tools that can be downloaded to a mobile phone), hand-held devices, and laptops/desktops, or they may be tools deployed by ISPs. Such tools may be used for collecting active or passive measurement data. Proposals may also include tools that can hide some of the collected information due to privacy concerns. Proposals should seek to demonstrate the application of the tool in a particular environment and to make the tool publicly available; at least one version of the tool is expected to be made available through an open-source license. Tool development should consider privacy and security aspects, including unintended harms of the proposed data collection mechanism. Finally, tools developed through this track should be accessible through Track 3 infrastructure.
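
The following hypothetical sketch shows the kind of active measurement primitive a Track 2 tool might build on: it times a TCP connection to a placeholder target and emits a timestamped JSON record. The target host, port, and output fields are assumptions for illustration; an actual tool would add scheduling, user consent, richer metadata, and the privacy controls described above.

    # Active measurement primitive sketch: TCP connect latency to a placeholder host.
    import json
    import socket
    import time

    def tcp_connect_rtt(host: str, port: int = 443, timeout: float = 3.0) -> dict:
        """Time a TCP handshake and return a timestamped, JSON-serializable record."""
        start = time.monotonic()
        try:
            with socket.create_connection((host, port), timeout=timeout):
                rtt_ms = (time.monotonic() - start) * 1000.0
        except OSError:
            rtt_ms = None  # target unreachable or timed out
        return {"target": host, "port": port, "rtt_ms": rtt_ms,
                "timestamp": time.time()}

    if __name__ == "__main__":
        # example.com is a placeholder target, not an endorsed measurement endpoint
        print(json.dumps(tcp_connect_rtt("example.com")))

Passive counterparts (e.g., reading interface counters or flow summaries) would produce records in the same spirit, which is what makes integration with a common repository format plausible.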

Internet Measurement Tool Development and Demonstration (MT) proposals can have total budgets of up to $600,000 for 2 years. For Year 3, the PI team may request an integration supplement of up to $100,000 to work with the Track 3 infrastructure provider so that the tools can be made available through that platform. This supplement may not be requested before 18 months into the project. Proposers in this track should already have a clear idea of the tool at the time of proposal submission and should consider hiring professional staff to ensure that a high-quality, well-tested tool is deployed by the end date of the award.

Track 3: Internet Measurement-Related Infrastructure-Planning (RI-P)

This track invites planning grants for a future Internet Measurement-Related Infrastructure (RI) that might be funded in the year 2024. The description below presents the desired characteristics of such an RI to help proposers decide on the appropriate activities for the RI-P planning grant proposal. Receipt of a planning grant is not a requirement for applying to the future RI solicitation, though planning grant proposals are strongly encouraged for those interested in building such an infrastructure.

The RI is intended to support comprehensive infrastructure development to host data and measurement tools for investigating Internet performance, connectivity, network topology, and service gaps. Such infrastructure should provide an interface for researchers to submit data for use by other researchers, curate the data, and ensure the data can be accessed efficiently and in a privacy-preserving manner. The RI proposers are expected to leverage and coordinate with existing Internet measurement repositories/infrastructure to avoid duplication of functionality, and to focus on addressing gaps, especially in wireless and fixed broadband access Internet. Proposers are also expected to have a plan to integrate the outcomes of the other tracks of the IMR program into this infrastructure. Some aspects of the tasks required to achieve these goals may require research and development. RI proposers are also expected to consider the sustainability of the infrastructure.

A successful Internet Measurement-Related Infrastructure is expected to serve as a repository for both tools and data, to curate data, to utilize privacy-preserving mechanisms, to facilitate the sharing of longitudinal data, and to develop data exchange formats (such as a temporal markup language for data exchange) or tools (such as data retrieval APIs) to facilitate data sharing and usage. Proposers for such an infrastructure are expected to identify and target the gaps in existing Internet measurement repositories/infrastructures that their proposed infrastructure will address and complement. The goal is to provide an integrated view of the wireless and wired Internet through this infrastructure. The proposed infrastructure should plan to develop tools for researchers to access/retrieve data, ensuring that data can be easily accessed, retrieved, and used by other researchers. Such infrastructure will likely need to provide ways to visualize information for both core networks and wireless access networks. The infrastructure should ensure that data archiving is available for both access (fixed and wireless) and core network measurement data, and for both active and passive measurement data, where not covered by existing repositories/infrastructure. For data that cannot be made publicly available (e.g., due to PII), the infrastructure should provide functionality for vetted researchers to access data in a privacy-preserving manner, such as under an institution- or university-mediated agreement to conduct research, especially for research reproducibility and repeatability. The infrastructure proposal should also properly plan for storage needs and the future sustainability of the infrastructure. The infrastructure is expected to work with developers of tools in Track 2 to ensure that their data collection can be integrated into the platform. Besides the privacy aspects discussed above, the development of the infrastructure should consider security aspects, including resilience to external attacks, access control, and the security of the data housed by the infrastructure.
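
As a purely illustrative sketch of the data exchange formats and retrieval tools mentioned above, the example below defines a minimal measurement record and a time-window selection helper. The field names and API shape are hypothetical assumptions, not a format prescribed by this solicitation; an actual RI would develop community-agreed schemas, access control, and provenance metadata.

    # Hypothetical shared record format and a minimal retrieval helper.
    import json
    from dataclasses import asdict, dataclass

    @dataclass
    class MeasurementRecord:
        tool: str             # e.g., "tcp-connect-rtt"
        network_type: str     # "fixed", "wireless", or "core"
        metric: str           # e.g., "rtt_ms"
        value: float
        timestamp: float      # seconds since the Unix epoch
        anonymized_src: str   # pseudonym, never a raw client address

    def select_window(records, start: float, end: float):
        """Return records whose timestamps fall within [start, end)."""
        return [r for r in records if start <= r.timestamp < end]

    records = [
        MeasurementRecord("tcp-connect-rtt", "wireless", "rtt_ms", 48.2, 1700000000.0, "a1b2"),
        MeasurementRecord("tcp-connect-rtt", "fixed", "rtt_ms", 12.7, 1700000300.0, "c3d4"),
    ]
    window = select_window(records, 1700000000.0, 1700000200.0)
    print(json.dumps([asdict(r) for r in window], indent=2))

A JSON serialization like this is only one option; the point is that a shared schema is what lets Track 2 tools and Track 1 analyses interoperate through the repository.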

The RI-P planning awards are for up to $100,000 for a duration of up to one year. It is anticipated that the eventual Internet Measurement-Related Infrastructure (RI) project might receive funding in the range of $5,000,000 to $10,000,000 for up to 5 years.

III. Award Information

Anticipated Type of Award: Continuing Grant or Standard Grant

Estimated Number of Awards: 26

Anticipated number, duration, and size of new awards:

Track 1: Methodologies and Methods (MM)

  • Number of awards: Approximately 11
  • Project length: 3 to 4 years
  • Award size: Up to $600,000

Track 2: Measurement Tool Development and Demonstration (MT)

  • Number of awards: Approximately 11
  • Project length: 2 years
  • Award size: Up to $600,000

Track 3: Internet Measurement Related Infrastructure-Planning (RI-P)

  • Number of awards: Approximately 4
  • Project length: 1 year
  • Award size: Up to $100,000

Anticipated Funding Amount: $14,000,000

Estimated program budget, number of awards and average award size/duration are subject to the availability of funds and quality of proposals received.

IV. Eligibility Information

Who May Submit Proposals:

Proposals may only be submitted by the following:

  • Institutions of Higher Education (IHEs) - Two- and four-year IHEs (including community colleges) accredited in, and having a campus located in the US, acting on behalf of their faculty members. Special Instructions for International Branch Campuses of US IHEs: If the proposal includes funding to be provided to an international branch campus of a US institution of higher education (including through use of subawards and consultant arrangements), the proposer must explain the benefit(s) to the project of performance at the international branch campus, and justify why the project activities cannot be performed at the US campus.
  • Non-profit, non-academic organizations: Independent museums, observatories, research labs, professional societies and similar organizations in the U.S. associated with educational or research activities.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or co-PI:

There are no restrictions or limits.

Additional Eligibility Info:

For Track 1, support for non-lead collaborating organizations may be requested through separately submitted collaborative proposals or as subawards. For Tracks 2 and 3, support for non-lead collaborating organizations should be requested as subawards. For these tracks, separately submitted collaborative proposals are not allowed. Subawardee institutions are subject to the same eligibility restrictions as those noted above.

V. Proposal Preparation And Submission Instructions

A. Proposal Preparation Instructions

Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via Research.gov or Grants.gov.

  • Full Proposals submitted via Research.gov: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Proposal and Award Policies and Procedures Guide (PAPPG). The complete text of the PAPPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg. Paper copies of the PAPPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov. The Prepare New Proposal setup will prompt you for the program solicitation number.
  • Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number, (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov.

In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via Research.gov. PAPPG Chapter II.D.3 provides additional information on collaborative proposals.

See PAPPG Chapter II.C.2 for guidance on the required sections of a full research proposal submitted to NSF. Please note that the proposal preparation instructions provided in this program solicitation may deviate from the PAPPG instructions.

Collaborative proposals with separate submissions from multiple organizations are only allowed for Track 1 (MM) proposals. For Track 1, support for non-lead collaborating organizations may be requested through separately submitted collaborative proposals or as subawards. PAPPG Chapter II.D.3 provides additional information on collaborative proposals.

For Track 2 (MT) and Track 3 (RI-P): support for non-lead collaborating organizations should be requested as subawards by the lead organization submitting the proposal. For these tracks, separately submitted collaborative proposals are not allowed.

Note that the Track 3: Internet Measurement-Related Infrastructure-Planning proposals described in this solicitation are a solicitation-specific project category and are separate and distinct from the type of proposal described in Chapter II.E.1 of the PAPPG. When preparing a Track 3 Planning proposal in response to this solicitation, the "Research" type of proposal should be selected.

Proposal Titles:

Proposal titles must begin with IMR followed by a colon, then the Track acronym (e.g., MM, MT, RI-P) followed by a colon. For the subtracks in Track 1, append 1A, 1B, or 1C to MM (e.g., MM-1A), followed by a colon, and then the title of the project. For example, if you are submitting a proposal for a Track 3 planning project, the title of the proposal would be IMR: RI-P: Title. If you are submitting a proposal in Subtrack 1A of Track 1, the title of the proposal would be IMR: MM-1A: Title. If a proposal is submitted as part of a set of collaborative proposals, the title of the proposal must begin with Collaborative Research followed by a colon. For example, if you are submitting a collaborative set of proposals for Subtrack 1B in Track 1, the title of each collaborative proposal would be Collaborative Research: IMR: MM-1B: Title. Please note that if submitting via Research.gov, the system will automatically insert the prepended title "Collaborative Research" when the collaborative set of proposals is created.

Project Summary:

At the beginning of the Overview section of the Project Summary, enter the title of the project, the name of the PI, and the lead organization. The Project Summary consists of an overview, a statement on the intellectual merit of the proposed activity, and a statement on the broader impacts of the proposed activity. Provide 3-5 high-level keyword descriptors for the project at the end of the Overview section of the Project Summary.

Project Description:

There is a 15-page limit for the Project Description for Track 1 and Track 2 proposals, and a 10-page limit for the Project Description for Track 3 planning proposals. Proposals that exceed these limits will be returned without review.

Supplementary Documents:

In the Supplementary Documents Section, upload the following:

(1) A list of Project Personnel and Partner Institutions:

Provide current, accurate information for all personnel and institutions involved in the project. NSF staff will use this information in the merit review process to manage reviewer selection. The list should include all PIs, co-PIs, Senior Personnel, paid/unpaid Consultants or Collaborators, Subawardees, and Postdocs. This list should be numbered and include (in this order) Full name, Organization(s), and Role in the project, with each item separated by a semi-colon. Each person listed should start a new numbered line. For example:

  • Mary Smith; XYZ University; PI
  • John Jones; University of PQR; Senior Personnel
  • Jane Brown; XYZ University; Postdoc
  • Bob Adams; ABC Community College; Paid Consultant
  • Susan White; DEF Corporation; Unpaid Collaborator
  • Tim Green; ZZZ University; Subawardee

(2) A List of Letters of Collaboration:

Provide names and affiliations of each individual who provided a letter of collaboration (any track) included in the next item. This should be on a separate page that follows the above list.

(3) Documentation of collaborative arrangements of significance to the proposal through Letters of Collaboration [See PAPPG Chapter II.C.2.d.(iv)]:

Letters of collaboration should be limited to stating the intent to collaborate and should not contain endorsements or evaluation of the proposed project.

(4) Collaboration Plans:

Note: In collaborative proposals, the lead institution should provide this information for all participants.

Since the success of collaborative research efforts is known to depend on thoughtful coordination mechanisms that regularly bring together the various participants of the project, all proposals submitted to this solicitation must include a Collaboration Plan of up to 2 pages if more than one PI is named in the project. The length of and degree of detail provided in the Collaboration Plan should be commensurate with the complexity of the proposed project. Where appropriate, the Collaboration Plan might include: 1) the specific roles of the project participants in all organizations involved; 2) information on how the project will be managed across all the investigators, institutions, and/or disciplines; 3) identification of the specific coordination mechanisms that will enable cross-investigator, cross-institution, and/or cross-disciplinary scientific integration (e.g., yearly workshops, graduate student exchange, project meetings at conferences, use of the grid for videoconferences, software repositories, etc.); and 4) specific references to the budget line items that support collaboration and coordination mechanisms.

If a project with more than one PI does not include a Collaboration Plan of up to 2 pages, that proposal will be returned without review.

(5) Data Management Plan (required):

Proposals must include a Supplementary Document of no more than two pages labeled "Data Management Plan." This supplementary document should describe how the proposal will conform to NSF policy on the dissemination and sharing of research results.

See Chapter II.C.2.j of the PAPPG for full policy implementation.

For additional information on the Dissemination and Sharing of Research Results, see: https://www.nsf.gov/bfa/dias/policy/dmp.jsp.

For specific guidance for Data Management Plans submitted to the Directorate for Computer and Information Science and Engineering (CISE) see: https://www.nsf.gov/cise/cise_dmp.jsp.

B. Budgetary Information

Cost Sharing:

Inclusion of voluntary committed cost sharing is prohibited.

C. Due Dates

Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

     February 15, 2022 (Track 1)

     March 08, 2022 (Track 2)

     March 22, 2022 (Track 3)

     February 15, 2023 (Track 1)

     March 08, 2023 (Track 2)

D. Research.gov/Grants.gov Requirements

For Proposals Submitted Via Research.gov:

To prepare and submit a proposal via Research.gov, see detailed technical instructions available at: https://www.research.gov/research-portal/appmanager/base/desktop?_nfpb=true&_pageLabel=research_node_display&_nodePath=/researchGov/Service/Desktop/ProposalPreparationandSubmission.html. For Research.gov user support, call the Research.gov Help Desk at 1-800-673-6188 or e-mail rgov@nsf.gov. The Research.gov Help Desk answers general technical questions related to the use of the Research.gov system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

For Proposals Submitted Via Grants.gov:

Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can then apply for any federal grant on the Grants.gov website. Comprehensive information about using Grants.gov is available on the Grants.gov Applicant Resources webpage: https://www.grants.gov/web/grants/applicants.html. In addition, the NSF Grants.gov Application Guide (see link in Section V.A) provides instructions regarding the technical preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to the NSF FastLane system for further processing.

Proposers that submitted via Research.gov may use Research.gov to verify the status of their submission to NSF. For proposers that submitted via Grants.gov, until an application has been received and validated by NSF, the Authorized Organizational Representative may check the status of an application on Grants.gov. After proposers have received an e-mail notification from NSF, Research.gov should be used to check the status of an application.

VI. NSF Proposal Processing And Review Procedures

Proposals received by NSF are assigned to the appropriate NSF program for acknowledgement and, if they meet NSF requirements, for review. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF either as ad hoc reviewers, panelists, or both, who are experts in the particular fields represented by the proposal. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers are invited to suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not review the proposal. These suggestions may serve as one source in the reviewer selection process at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposal. In addition, Program Officers may obtain comments from site visits before recommending final action on proposals. Senior NSF staff further review recommendations for awards. A flowchart that depicts the entire NSF proposal and award process (and associated timeline) is included in PAPPG Exhibit III-1.

A comprehensive description of the Foundation's merit review process is available on the NSF website at: https://www.nsf.gov/bfa/dias/policy/merit_review/.

Proposers should also be aware of core strategies that are essential to the fulfillment of NSF's mission, as articulated in Building the Future: Investing in Discovery and Innovation - NSF Strategic Plan for Fiscal Years (FY) 2018 – 2022. These strategies are integrated in the program planning and implementation process, of which proposal review is one part. NSF's mission is particularly well-implemented through the integration of research and education and broadening participation in NSF programs, projects, and activities.

One of the strategic objectives in support of NSF's mission is to foster integration of research and education through the programs, projects, and activities it supports at academic and research institutions. These institutions must recruit, train, and prepare a diverse STEM workforce to advance the frontiers of science and participate in the U.S. technology-based economy. NSF's contribution to the national innovation ecosystem is to provide cutting-edge research under the guidance of the Nation's most creative scientists and engineers. NSF also supports development of a strong science, technology, engineering, and mathematics (STEM) workforce by investing in building the knowledge that informs improvements in STEM teaching and learning.

NSF's mission calls for the broadening of opportunities and expanding participation of groups, institutions, and geographic regions that are underrepresented in STEM disciplines, which is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports.

A. Merit Review Principles and Criteria

The National Science Foundation strives to invest in a robust and diverse portfolio of projects that creates new knowledge and enables breakthroughs in understanding across all areas of science and engineering research and education. To identify which projects to support, NSF relies on a merit review process that incorporates consideration of both the technical aspects of a proposed project and its potential to contribute more broadly to advancing NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes." NSF makes every effort to conduct a fair, competitive, transparent merit review process for the selection of projects.

1. Merit Review Principles

These principles are to be given due diligence by PIs and organizations when preparing proposals and managing projects, by reviewers when reading and evaluating proposals, and by NSF program staff when determining whether or not to recommend proposals for funding and while overseeing awards. Given that NSF is the primary federal agency charged with nurturing and supporting excellence in basic research and education, the following three principles apply:

  • All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge.
  • NSF projects, in the aggregate, should contribute more broadly to achieving societal goals. These "Broader Impacts" may be accomplished through the research itself, through activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. The project activities may be based on previously established and/or innovative methods and approaches, but in either case must be well justified.
  • Meaningful assessment and evaluation of NSF funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects. If the size of the activity is limited, evaluation of that activity in isolation is not likely to be meaningful. Thus, assessing the effectiveness of these activities may best be done at a higher, more aggregated, level than the individual project.

With respect to the third principle, even if assessment of Broader Impacts outcomes for particular projects is done at an aggregated level, PIs are expected to be accountable for carrying out the activities described in the funded project. Thus, individual projects should include clearly stated goals, specific descriptions of the activities that the PI intends to do, and a plan in place to document the outputs of those activities.

These three merit review principles provide the basis for the merit review criteria, as well as a context within which the users of the criteria can better understand their intent.

2. Merit Review Criteria

All NSF proposals are evaluated through use of the two National Science Board approved merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

The two merit review criteria are listed below. Both criteria are to be given full consideration during the review and decision-making processes; each criterion is necessary but neither, by itself, is sufficient. Therefore, proposers must fully address both criteria. (PAPPG Chapter II.C.2.d(i). contains additional information for use by proposers in development of the Project Description section of the proposal). Reviewers are strongly encouraged to review the criteria, including PAPPG Chapter II.C.2.d(i), prior to the review of a proposal.

When evaluating NSF proposals, reviewers will be asked to consider what the proposers want to do, why they want to do it, how they plan to do it, how they will know if they succeed, and what benefits could accrue if the project is successful. These issues apply both to the technical aspects of the proposal and the way in which the project may make broader contributions. To that end, reviewers will be asked to evaluate all proposals against two criteria:

  • Intellectual Merit: The Intellectual Merit criterion encompasses the potential to advance knowledge; and
  • Broader Impacts: The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.

The following elements should be considered in the review for both criteria:

  1. What is the potential for the proposed activity to
    a. Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and
    b. Benefit society or advance desired societal outcomes (Broader Impacts)?
  2. To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts?
  3. Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
  4. How well qualified is the individual, team, or organization to conduct the proposed activities?
  5. Are there adequate resources available to the PI (either at the home organization or through collaborations) to carry out the proposed activities?

Broader impacts may be accomplished through the research itself, through the activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. NSF values the advancement of scientific knowledge and activities that contribute to achievement of societally relevant outcomes. Such outcomes include, but are not limited to: full participation of women, persons with disabilities, and other underrepresented groups in science, technology, engineering, and mathematics (STEM); improved STEM education and educator development at any level; increased public scientific literacy and public engagement with science and technology; improved well-being of individuals in society; development of a diverse, globally competitive STEM workforce; increased partnerships between academia, industry, and others; improved national security; increased economic competitiveness of the United States; and enhanced infrastructure for research and education.

Proposers are reminded that reviewers will also be asked to review the Data Management Plan and the Postdoctoral Researcher Mentoring Plan, as appropriate.

Additional Solicitation Specific Review Criteria

Within the context of the Intellectual Merit and Broader Impacts criteria, reviewers will be asked to consider the following issues when preparing their reviews:

For Track 1 (MM) proposals:

As noted below, the review criteria will vary depending on whether the project is submitted to subtrack 1A, 1B, or 1C.

  • How does the proposed methodology fill a current gap? (1A, 1B, 1C)
  • Significance in Internet Measurement: Does the project address a significant problem in the area of Internet measurement? Is the prior research supporting the proposed project rigorous? (1A, 1B, 1C)
  • Statistical Innovation: Are the proposed statistical methods innovative? Does the project apply existing statistical methodologies to a new situation? If yes, how challenging are such applications? (1A)
  • Integration: How well do the proposed statistical methods integrate with the intended Internet measurement problems? How well does the project develop new statistical methodologies that can apply to Internet Measurement? (1A)
  • Privacy-preserving innovation: How well do the proposed privacy protection methods integrate with the intended Internet measurement privacy problems? How well does the project develop new privacy protection methodologies that can apply to Internet Measurement? (1B)
  • Integration and Utility: How well will the proposed methods integrate with the intended Internet measurement problems? To what extent do the approaches proposed preserve data utility? (1A, 1B, 1C)
  • Are security and privacy aspects being considered in the design, implementation and evaluation of the proposed methods, methodologies, approaches, and tools? (1A, 1C)

For Track 2 (MT) proposals:

  • How broadly applicable is the tool expected to be?
  • Will the tool provide a new method for collecting Internet Measurement data?
  • If the tool is developed for end-users, how will the tool provide privacy for data collection?
  • Will the tool be made publicly deployable and include plans for an open-source license? If so, how?
  • Are security and privacy aspects being considered in the design, implementation and evaluation of the proposed methods, methodologies, approaches, and tools?

Track 3 (RI-P) planning proposals:

  • Is there a vision for an innovative infrastructure project that could lead to advancing Internet Measurement research, as described in this solicitation?
  • Is there a compelling plan of activities presented to develop a realistic project management and execution plan for the eventual infrastructure and associated services, tools and resources?
  • Does the proposed team have the expertise and leadership needed to lead a community effort and help shape the resource to meet community needs?

B. Review and Selection Process

Proposals submitted in response to this program solicitation will be reviewed by Ad hoc Review and/or Panel Review.

Reviewers will be asked to evaluate proposals using two National Science Board approved merit review criteria and, if applicable, additional program specific criteria. A summary rating and accompanying narrative will generally be completed and submitted by each reviewer and/or panel. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

After scientific, technical and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF strives to be able to tell applicants whether their proposals have been declined or recommended for funding within six months. Large or particularly complex proposals or proposals from new awardees may require additional review and processing time. The time interval begins on the deadline or target date, or receipt date, whichever is later. The interval ends when the Division Director acts upon the Program Officer's recommendation.

After programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications. After an administrative review has occurred, Grants and Agreements Officers perform the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with a NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.

Once an award or declination decision has been made, Principal Investigators are provided feedback about their proposals. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers or any reviewer-identifying information, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

VII. Award Administration Information

A. Notification of the Award

Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award notice, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award notice; (4) the applicable award conditions, such as Grant General Conditions (GC-1)* or Research Terms and Conditions*; and (5) any announcement or other NSF issuance that may be incorporated by reference in the award notice. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail.

*These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-8134 or by e-mail from nsfpubs@nsf.gov.

More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

Special Award Conditions:

Track 2 Internet Measurement Tool awards will require that at least one version of the tool be made available under an open-source license.

C. Reporting Requirements

For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer no later than 90 days prior to the end of the current budget period. (Some programs or awards require submission of more frequent project reports). No later than 120 days following expiration of a grant, the PI also is required to submit a final project report, and a project outcomes report for the general public.

Failure to provide the required annual or final project reports, or the project outcomes report, will delay NSF review and processing of any future funding increments as well as any pending proposals for all identified PIs and co-PIs on a given award. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project-reporting system, available through Research.gov, for preparation and submission of annual and final project reports. Such reports provide information on accomplishments, project participants (individual and organizational), publications, and other specific products and impacts of the project. Submission of the report via Research.gov constitutes certification by the PI that the contents of the report are accurate and complete. The project outcomes report also must be prepared and submitted using Research.gov. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will be posted on the NSF website exactly as it is submitted by the PI.

More comprehensive information on NSF Reporting Requirements and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

VIII. Agency Contacts

Please note that the program contact information is current at the time of publication. See the program website for any updates to the points of contact.

General inquiries regarding this program should be made to:

  • Deepankar (Deep) Medhi, Program Director, CISE/CNS, telephone: (703) 292-2935, email: dmedhi@nsf.gov
  • Ann C. Von Lehmen, Program Director, CISE/CNS, telephone: (703) 292-4756, email: avonlehm@nsf.gov
  • Darleen L. Fisher, Program Director, CISE/CNS, telephone: (703) 292-8950, email: dlfisher@nsf.gov
  • Alexander Sprintson, Program Director, CISE/CNS, telephone: (703) 292-8950, email: asprints@nsf.gov
  • Murat Torlak, Program Director, CISE/CNS, telephone: (703) 292-7748, email: mtorlak@nsf.gov
  • Daniela Oliveira, Program Director, CISE/CNS, telephone: (703) 292-4352, email: doliveir@nsf.gov
  • James Joshi, Program Director, CISE/CNS, telephone: (703) 292-8950, email: jjoshi@nsf.gov
  • Funda Ergun, Program Director, CISE/CCF, telephone: (703) 292-2216, email: fergun@nsf.gov
  • Robert Beverly, Program Director, CISE/OAC, telephone: (703) 292-7068, email: rbeverly@nsf.gov
  • Kevin L. Thompson, Program Director, CISE/OAC, telephone: (703) 292-4220, email: kthompso@nsf.gov
  • Edsel A. Pena, Program Director, MPS/DMS, telephone: (703) 292-8080, email: epena@nsf.gov

For questions related to the use of FastLane or Research.gov, contact:

  • FastLane and Research.gov Help Desk: 1-800-673-6188
  • FastLane Help Desk e-mail: fastlane@nsf.gov
  • Research.gov Help Desk e-mail: rgov@nsf.gov

For questions relating to Grants.gov contact:

  • Grants.gov Contact Center: If the Authorized Organizational Representative (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of the application, please contact the center by telephone at 1-800-518-4726 or by e-mail at support@grants.gov.

IX. Other Information

The NSF website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this website by potential proposers is strongly encouraged. In addition, "NSF Update" is an information-delivery system designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF Grants Conferences. Subscribers are informed through e-mail or the user's Web browser each time new publications are issued that match their identified interests. "NSF Update" also is available on NSF's website.

Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this mechanism. Further information on Grants.gov may be obtained at https://www.grants.gov.

About The National Science Foundation

The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering."

NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research.

NSF receives approximately 55,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Arctic and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See the NSF Proposal & Award Policies & Procedures Guide Chapter II.E.6 for instructions regarding preparation of these types of proposals.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339.

The National Science Foundation Information Center may be reached at (703) 292-5111.

The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering.

To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF Website at https://www.nsf.gov

  • Location: 2415 Eisenhower Avenue, Alexandria, VA 22314
  • For General Information (NSF Information Center): (703) 292-5111
  • TDD (for the hearing-impaired): (703) 292-5090
  • To Order Publications or Forms: send an e-mail to nsfpubs@nsf.gov or telephone (703) 292-8134
  • To Locate NSF Employees: (703) 292-5111

Privacy Act And Public Burden Statements

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to proposer institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies or other entities needing information regarding applicants or nominees as part of a joint application review process, or in order to coordinate programs or policy; and to another Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See System of Record Notices, NSF-50, "Principal Investigator/Proposal File and Associated Records," and NSF-51, "Reviewer/Proposal File and Associated Records." Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.

An agency may not conduct or sponsor, and a person is not required to respond to, an information collection unless it displays a valid Office of Management and Budget (OMB) control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding the burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to:

Suzanne H. Plimpton
Reports Clearance Officer
Policy Office, Division of Institution and Award Support
Office of Budget, Finance, and Award Management
National Science Foundation
Alexandria, VA 22314