The National Research Council's Army Research Laboratory Technical Assessment Board (ARLTAB) provides biennial assessments of the scientific and technical quality of the research, development, and analysis programs at the Army Research Laboratory, focusing on ballistics sciences, human sciences, information sciences, materials sciences, and mechanical sciences. This report discusses the biennial assessment process used by ARLTAB and its five panels; provides detailed assessments of each of the ARL core technical competency areas reviewed during the 2013-2014 period; and presents findings and recommendations common across multiple competency areas.
"Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges is an independent assessment regarding the transition of the National Nuclear Security Administration (NNSA) laboratories - Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories - to multiagency, federally funded research and development centers with direct sustainment and sponsorship by multiple national security agencies. This report makes recommendations for the governance of NNSA laboratories to better align with the evolving national security landscape and the laboratories' increasing engagement with the other national security agencies, while simultaneously encouraging the best technical solutions to national problems from the entire range of national security establishments. According to this report, the Department of Energy should remain the sole sponsor of the NNSA laboratories as federally funded research and development centers. The NNSA laboratories will remain a critically important resource to meet U.S. national security needs for many decades to come. The recommendations of Aligning the Governance Structure of the NNSA Laboratories to Meet 21st Century National Security Challenges will improve the governance of the laboratories and strengthen their strategic relationship with the non-DOE national security agencies."--
"In January 2014, the Board on Children, Youth, and Families of the Institute of Medicine and the National Research Council, in collaboration with the IOM Board on Global Health, launched the Forum on Investing in Young Children Globally. At this meeting, the participants agreed to focus on creating and sustaining, over 3 years, an evidence-driven community of stakeholders that aims to explore existing, new, and innovative science and research from around the world and translate this evidence into sound and strategic investments in policies and practices that will make a difference in the lives of children and their caregivers. Financing Investments in Young Children Globally is the summary of a workshop hosted by the Forum on Investing in Young Children Globally in August 2014. This workshop, on financing investments for young children, brought together stakeholders from such disciplines as social protection, nutrition, education, health, finance, economics, and law and included practitioners, advocates, researchers, and policy makers. Presentations and discussions identified some of the current issues in financing investments across health, education, nutrition, and social protection that aim to improve children's developmental potential. This report explores issues across three broad domains of financing: (1) costs of programs for young children; (2) sources of funding, including public and private investments; and (3) allocation of these investments, including cash transfers, microcredit programs, block grants, and government restructuring."
The mission of the Engineering Laboratory of the National Institute of Standards and Technology (NIST) is to promote U.S. innovation and industrial competitiveness through measurement science and standards for technology-intensive manufacturing, construction, and cyber-physical systems in ways that enhance economic prosperity and improve the quality of life. To support this mission, the Engineering Laboratory has developed thrusts in smart manufacturing, construction, and cyber-physical systems; in sustainable and energy-efficient manufacturing materials and infrastructure; and in disaster-resilient buildings, infrastructure, and communities. The technical work of the Engineering Laboratory is performed in five divisions: Intelligent Systems; Materials and Structural Systems; Energy and Environment; Systems Integration; and Fire Research; and two offices: Applied Economics Office and Smart Grid Program Office. An Assessment of the National Institute of Standards and Technology Engineering Laboratory Fiscal Year 2014 assesses the scientific and technical work performed by the NIST Engineering Laboratory. This report evaluates the organization's technical programs, its portfolio of scientific expertise, the adequacy of its facilities, equipment, and human resources, and the effectiveness with which it disseminates its program outputs.
The National Institute of Standards and Technology's (NIST's) Material Measurement Laboratory (MML) is our nation's reference laboratory for measurements in the chemical, biological, and materials sciences and engineering. Staff of the MML develop state-of-the-art measurement techniques and conduct fundamental research related to measuring the composition, structure, and properties of substances. Tools that include reference materials, data, and measurement services are developed to support industries that range from transportation to biotechnology and to address problems such as climate change, environmental sciences, renewable energy, health care, infrastructure, food safety and nutrition, and forensics. This report assesses the scientific and technical work performed by NIST's Material Measurement Laboratory. In particular, the report assesses the organization's technical programs, the portfolio of scientific expertise within the organization, the adequacy of the organization's facilities, equipment, and human resources, and the effectiveness with which the organization disseminates its program outputs.
"The 2012 National Research Council report Disaster Resilience: A National Imperative highlighted the challenges of increasing national resilience in the United States. One finding of the report was that "without numerical means of assessing resilience, it would be impossible to identify the priority needs for improvement, to monitor changes, to show that resilience had improved, or to compare the benefits of increasing resilience with the associated costs." Although measuring resilience is a challenge, metrics and indicators to evaluate progress, and the data necessary to establish the metric, are critical for helping communities to clarify and formalize what the concept of resilience means for them, and to support efforts to develop and prioritize resilience investments. One of the recommendations from the 2012 report stated that government entities at federal, state, and local levels and professional organizations should partner to help develop a framework for communities to adapt to their circumstances and begin to track their progress toward increasing resilience. To build upon this recommendation and begin to help communities formulate such a framework, the Resilient America Roundtable of the National Academies convened the workshop Measures of Community Resilience: From Lessons Learned to Lessons Applied on September 5, 2014 in Washington, D.C. The workshop's overarching objective was to begin to develop a framework of measures and indicators that could support community efforts to increase their resilience. The framework will be further developed through feedback and testing in pilot and other partner communities that are working with the Resilient America Roundtable. This report is a summary of the one-day workshop, which consisted of a keynote address and two panel sessions in the morning and afternoon breakout sessions that began the discussion on how to develop a framework of resilience measures."-- Publisher's description
Every year, the U.S. Army must select from an applicant pool in the hundreds of thousands to meet annual enlistment targets, currently numbering in the tens of thousands of new soldiers. A critical component of the selection process for enlisted service members is the formal assessments administered to applicants to determine their performance potential. Attrition for the U.S. military is hugely expensive. Every recruit who does not make it through basic training or beyond a first enlistment costs hundreds of thousands of dollars. Academic and other professional settings suffer similar losses when the wrong individuals are accepted into the wrong schools and programs or jobs and companies. Picking the right people from the start is becoming increasingly important in today's economy and in response to the growing numbers of applicants. Beyond cognitive tests of ability, what other attributes should selectors be considering to know whether an individual has the talent and the capability to perform as well as the mental and psychological drive to succeed? Measuring Human Capabilities: An Agenda for Basic Research on the Assessment of Individual and Group Performance Potential for Military Accession examines promising emerging theoretical, technological, and statistical advances that could provide scientifically valid new approaches and measurement capabilities to assess human capability. This report considers the basic research necessary to maximize the efficiency, accuracy, and effective use of human capability measures in the military's selection and initial occupational assignment process. The research recommendations of Measuring Human Capabilities will identify ways to supplement the Army's enlisted soldier accession system with additional predictors of individual and collective performance. Although the primary audience for this report is the U.S. military, this book will be of interest to researchers of psychometrics, personnel selection and testing, team dynamics, cognitive ability, and measurement methods and technologies. Professionals interested in the foundational science behind academic testing, job selection, and human resources management will also find this report of interest.
Hurricane- and coastal-storm-related losses have increased substantially during the past century, largely due to increases in population and development in the most susceptible coastal areas. Climate change poses additional threats to coastal communities from sea level rise and possible increases in strength of the largest hurricanes. Several large cities in the United States have extensive assets at risk to coastal storms, along with countless smaller cities and developed areas. The devastation from Superstorm Sandy has heightened the nation's awareness of these vulnerabilities. What can we do to better prepare for and respond to the increasing risks of loss? Reducing Coastal Risk on the East and Gulf Coasts reviews the coastal risk-reduction strategies and levels of protection that have been used along the United States East and Gulf Coasts to reduce the impacts of coastal flooding associated with storm surges. This report evaluates their effectiveness in terms of economic return, protection of life safety, and minimization of environmental effects. According to this report, the vast majority of the funding for coastal risk-related issues is provided only after a disaster occurs. This report calls for the development of a national vision for coastal risk management that includes a long-term view, regional solutions, and recognition of the full array of economic, social, environmental, and life-safety benefits that come from risk reduction efforts. To support this vision, Reducing Coastal Risk states that a national coastal risk assessment is needed to identify those areas with the greatest risks that are high priorities for risk reduction efforts. The report discusses the implications of expanding the extent and levels of coastal storm surge protection in terms of operation and maintenance costs and the availability of resources. Reducing Coastal Risk recommends that benefit-cost analysis, constrained by acceptable risk criteria and other important environmental and social factors, be used as a framework for evaluating national investments in coastal risk reduction. The recommendations of this report will assist engineers, planners and policy makers at national, regional, state, and local levels to move from a nation that is primarily reactive to coastal disasters to one that invests wisely in coastal risk reduction and builds resilience among coastal communities.
"National Center for Science and Engineering Statistics (NCSES) of the National Science Foundation is responsible for national reporting of the research and development (R&D) activities that occur in all sectors of the United States economy. For most sectors, including the business and higher education sectors, NCSES collects data on these activities on a regular basis. However, data on R&D within the nonprofit sector have not been collected in 18 years, a time period which has seen dynamic and rapid growth of the sector. NCSES decided to design and implement a new survey of nonprofits, and commissioned this workshop to provide a forum to discuss conceptual and design issues and methods. Measuring Research and Development Expenditures in the U.S. Nonprofit Sector: Conceptual and Design Issues summarizes the presentations and discussion of the workshop. This report identifies concepts and issues for the design of a survey of R&D expenditures made by nonprofit organizations, considering the goals, content, statistical methodology, data quality, and data products associated with this data collection. The report also considers the broader usefulness of the data for understanding the nature of the nonprofit sector and their R&D activities. Measuring Research and Development Expenditures in the U. S. Nonprofit Sector will help readers understand the role of nonprofit sector given its enormous size and scope as well as its contribution to identifying new forms of R&D beyond production processes and new technology."--
"Measuring the Risks and Causes of Premature Death is the summary of two workshops conducted by The Committee on Population of the National Research Council at the National Academies to address the data sources, science and future research needs to understand the causes of premature mortality in the United States. The workshops reviewed previous work in the field in light of new data generated as part of the work of the NRC Panel on Understanding Divergent Trends in Longevity in High-Income Countries (NRC, 2011) and the NRC/IOM Panel on Understanding Cross-National Differences Among High-Income Countries (NRC/IOM, 2013). The workshop presentations considered the state of the science of measuring the determinants of the causes of premature death, assessed the availability and quality of data sources, and charted future courses of action to improve the understanding of the causes of premature death. Presenters shared their approaches to and results of measuring premature mortality and specific risk factors, with a particular focus on those factors most amenable to improvement through public health policy. This report summarizes the presentations and discussion of both workshops." --
"The American Community Survey (ACS) was conceptualized as a replacement to the census long form, which collected detailed population and housing data from a sample of the U.S. population, once a decade, as part of the decennial census operations. The long form was traditionally the main source of socio-economic information for areas below the national level. The data provided for small areas, such as counties, municipalities, and neighborhoods is what made the long form unique, and what makes the ACS unique today. Since the successful transition from the decennial long form in 2005, the ACS has become an invaluable resource for many stakeholders, particularly for meeting national and state level data needs. However, due to inadequate sample sizes, a major challenge for the survey is producing reliable estimates for smaller geographic areas, which is a concern because of the unique role fulfilled by the long form, and now the ACS, of providing data with a geographic granularity that no other federal survey could provide. In addition to the primary challenge associated with the reliability of the estimates, this is also a good time to assess other aspects of the survey in order to identify opportunities for refinement based on the experience of the first few years. Realizing the Potential of the American Community Survey provides input on ways of improving the ACS, focusing on two priority areas: identifying methods that could improve the quality of the data available for small areas, and suggesting changes that would increase the survey's efficiency in responding to new data needs. This report considers changes that the ACS office should consider over the course of the next few years in order to further improve the ACS data. The recommendations of Realizing the Potential of the American Community Survey will help the Census Bureau improve performance in several areas, which may ultimately lead to improved data products as the survey enters its next decade."--Publisher's description.
Over the past few decades there have been major successes in creating evidence-based interventions to improve the cognitive, affective, and behavioral health of children. Many of these interventions have been put into practice at the local, state, or national level. To reap what has been learned from such implementation, and to explore how new legislation and policies as well as advances in technology and analytical methods can help drive future implementation, the Institute of Medicine-National Research Council Forum on Promoting Children's Cognitive, Affective, and Behavioral Health held the workshop "Harvesting the Scientific Investment in Prevention Science to Promote Children's Cognitive, Affective, and Behavioral Health" in Washington, DC, on June 16 and 17, 2014. The workshop featured panel discussions of system-level levers and blockages to the broad implementation of interventions with fidelity, focusing on policy, finance, and method science; the role of scientific norms, implementation strategies, and practices in care quality and outcomes at the national, state, and local levels; and new methodological directions. The workshop also featured keynote presentations on the role of economics and policy in scaling interventions for children's behavioral health, and making better use of evidence to design informed and more efficient children's mental health systems. Harvesting the Scientific Investment in Prevention Science to Promote Children's Cognitive, Affective, and Behavioral Health summarizes the presentations and discussion of the workshop.
"Building Infrastructure for International Collaborative Research in the Social and Behavioral Sciences is the summary of a workshop convened by the National Research Council's Committee on International Collaborations in Social and Behavioral Sciences in September 2013 to identify ways to reduce impediments and to increase access to cross-national research collaborations among a broad range of American scholars in the behavioral and social sciences (and education), especially early career scholars. Over the course of two and a half days, individuals from universities and federal agencies, professional organizations, and other parties with interests in international collaboration in the behavior and social sciences and education made presentations and participated in discussions. They came from diverse fields including cognitive psychology, developmental psychology, comparative education, educational anthropology, sociology, organizational psychology, the health sciences, international development studies, higher education administration, and international exchange."--Publisher's description.
"Since the early 1960s, the U.S. strategic nuclear posture has been composed of a triad of nuclear-certified long-range bombers, intercontinental ballistic missiles, and submarine-launched ballistic missiles. Since the early 1970s, U.S. nuclear forces have been subject to strategic arms control agreements. The large numbers and diversified nature of the U.S. nonstrategic (tactical) nuclear forces, which cannot be ignored as part of the overall nuclear deterrent, have decreased substantially since the Cold War. While there is domestic consensus today on the need to maintain an effective deterrent, there is no consensus on precisely what that requires, especially in a changing geopolitical environment and with continued reductions in nuclear arms. This places a premium on having the best possible analytic tools, methods, and approaches for understanding how nuclear deterrence and assurance work, how they might fail, and how failure can be averted by U.S. nuclear forces. U.S. Air Force Strategic Deterrence Analytic Capabilities identifies the broad analytic issues and factors that must be considered in seeking nuclear deterrence of adversaries and assurance of allies in the 21st century. This report describes and assesses tools, methods - including behavioral science-based methods - and approaches for improving the understanding of how nuclear deterrence and assurance work or may fail in the 21st century and the extent to which such failures might be averted or mitigated by the proper choice of nuclear systems, technological capabilities, postures, and concepts of operation of American nuclear forces. The report recommends criteria and a framework for validating the tools, methods, and approaches and for identifying those most promising for Air Force usage."--Publisher's description.
The Science of Responding to a Nuclear Reactor Accident summarizes the presentations and discussions of the May 2014 Gilbert W. Beebe Symposium titled "The Science and Response to a Nuclear Reactor Accident". The symposium, dedicated in honor of the distinguished National Cancer Institute radiation epidemiologist who died in 2003, was co-hosted by the Nuclear and Radiation Studies Board of the National Academy of Sciences and the National Cancer Institute. The symposium topic was prompted by the March 2011 accident at the Fukushima Daiichi nuclear power plant that was initiated by the 9.0-magnitude earthquake and tsunami off the northeast coast of Japan. This was the fourth major nuclear accident that has occurred since the beginning of the nuclear age some 60 years ago. The 1957 Windscale accident in the United Kingdom caused by a fire in the reactor, the 1979 Three Mile Island accident in the United States caused by mechanical and human errors, and the 1986 Chernobyl accident in the former Soviet Union caused by a series of human errors during the conduct of a reactor experiment are the other three major accidents. The rarity of nuclear accidents and the limited experience that has been assembled over the decades heighten the importance of learning from the past. The symposium promoted discussions among federal, state, academic, research institute, and news media representatives on current scientific knowledge and response plans for nuclear reactor accidents. The Beebe symposium explored how experiences from past nuclear plant accidents can be used to mitigate the consequences of future accidents, if they occur. The Science of Responding to a Nuclear Reactor Accident addresses off-site emergency response and long-term management of the accident consequences; estimating radiation exposures of affected populations; health effects and population monitoring; other radiological consequences; and communication among plant officials, government officials, and the public and the role of the media.
In today's world, the range of technologies with the potential to threaten the security of U.S. military forces is extremely broad. These include developments in explosive materials, sensors, control systems, robotics, satellite systems, and computing power, to name just a few. Such technologies not only enhance the capabilities of U.S. military forces but also offer enhanced offensive capabilities to potential adversaries - either directly, through the development of more sophisticated weapons, or indirectly, through opportunities for interrupting the function of defensive U.S. military systems. Passive and active electro-optical (EO) sensing technologies are prime examples. Laser Radar considers the potential of active EO technologies to create surprise; i.e., systems that use a source of visible or infrared light to interrogate a target in combination with sensitive detectors and processors to analyze the returned light. The addition of an interrogating light source to the system adds rich new phenomenologies that enable new capabilities to be explored. This report evaluates the fundamental, physical limits to active EO sensor technologies with potential military utility; identifies key technologies that may help overcome the impediments within a 5-10 year timeframe; considers the pros and cons of implementing each existing or emerging technology; and evaluates the potential uses of active EO sensing technologies, including 3D mapping and multi-discriminant laser radar technologies.
The Evidence for Violence Prevention Across the Lifespan and Around the World is the summary of a workshop convened in January 2013 by the Institute of Medicine's Forum on Global Violence Prevention to explore the value and application of the evidence for violence prevention across the lifespan and around the world. Because part of the Forum's mandate is to engage in multisectoral, multidirectional dialogue that explores crosscutting approaches to violence prevention, this workshop examined how existing evidence for violence prevention can continue to be expanded, disseminated, and implemented in ways that further the ultimate aims of improved individual well-being and safer communities. This report examines violence prevention interventions that have been proven to reduce different types of violence (e.g., child and elder abuse, intimate partner and sexual violence, youth and collective violence, and self-directed violence), identifies the common approaches most lacking in evidentiary support, and discusses ways that proven effective interventions can be integrated or otherwise linked with other prevention programs.
"Many national initiatives in K-12 science, technology, engineering, and mathematics (STEM) education have emphasized the connections between teachers and improved student learning. Much of the discussion surrounding these initiatives has focused on the preparation, professional development, evaluation, compensation, and career advancement of teachers. Yet one critical set of voices has been largely missing from this discussion - that of classroom teachers themselves. To explore the potential for STEM teacher leaders to improve student learning through involvement in education policy and decision making, the National Research Council held a convocation in June 2014 entitled "One Year After Science's Grand Challenges in Education: Professional Leadership of STEM Teachers through Education Policy and Decision Making." This event was structured around a special issue of Science magazine that discussed 20 grand challenges in science education. The authors of three major articles in that issue - along with Dr. Bruce Alberts, Science's editor-in-chief at the time - spoke at the convocation, updating their earlier observations and applying them directly to the issue of STEM teacher leadership. The convocation focused on empowering teachers to play greater leadership roles in education policy and decision making in STEM education at the national, state, and local levels. Exploring Opportunities for STEM Teacher Leadership is a record of the presentations and discussion of that event. This report will be of interest to STEM teachers, education professionals, and state and local policy makers."
A high percentage of defense systems fail to meet their reliability requirements. This is a serious problem for the U.S. Department of Defense (DOD), as well as the nation. Those systems are not only less likely to successfully carry out their intended missions, but they also could endanger the lives of the operators. Furthermore, reliability failures discovered after deployment can result in costly and strategic delays and the need for expensive redesign, which often limits the tactical situations in which the system can be used. Finally, systems that fail to meet their reliability requirements are much more likely to need additional scheduled and unscheduled maintenance and to need more spare parts and possibly replacement systems, all of which can substantially increase the life-cycle costs of a system. Beginning in 2008, DOD undertook a concerted effort to raise the priority of reliability through greater use of design for reliability techniques, reliability growth testing, and formal reliability growth modeling, by both the contractors and DOD units. To this end, handbooks, guidances, and formal memoranda were revised or newly issued to reduce the frequency of reliability deficiencies for defense systems in operational testing and the effects of those deficiencies. Reliability Growth evaluates these recent changes and, more generally, assesses how current DOD principles and practices could be modified to increase the likelihood that defense systems will satisfy their reliability requirements. This report examines changes to the reliability requirements for proposed systems; defines modern design and testing for reliability; discusses the contractor's role in reliability testing; and summarizes the current state of formal reliability growth modeling. The recommendations of Reliability Growth will improve the reliability of defense systems and protect the health of the valuable personnel who operate them.
"The National Marine Fisheries Service (NMFS) is responsible for the stewardship of the nation's living marine resources and their habitat. As part of this charge, NMFS conducts stock assessments of the abundance and composition of fish stocks in several bodies of water. At present, stock assessments rely heavily on human data-gathering and analysis. Automatic means of fish stock assessments are appealing because they offer the potential to improve efficiency and reduce human workload and perhaps develop higher-fidelity measurements. The use of images and video, when accompanies by appropriate statistical analyses of the inferred data, is of increasing importance for estimating the abundance of species and their age distributions. Robust Methods for the Analysis of Images and Videos for Fisheries Stock Assessment is the summary of a workshop convened by the National Research Council Committee on Applied and Theoretical Statistics to discuss analysis techniques for images and videos for fisheries stock assessment. Experts from diverse communities shared perspective about the most efficient path toward improved automation of visual information and discussed both near-term and long-term goals that can be achieved through research and development efforts. This report is a record of the presentations and discussions of this event."--
"During the period 1990 to 2010, U.S. job growth occurred primarily in the high-skilled and low-skilled sectors. Yet, one-third of projected job growth for the period 2010 to 2020 will require middle-skilled workers -- who will earn strong middle-class wages and salaries -- important to both the production and consumption components of our economy. These jobs typically require significant training, often requiring more than a high school diploma but less than a baccalaureate degree. In the Gulf of Mexico, middle skilled workers play key roles in maintaining oil system safety, completing the numerous environmental restoration projects needed along the Gulf coast, and as workers in an integrated and resilient public health system. Educational pathways that lead to middle skilled jobs in these areas include: apprenticeship programs offered by schools, unions, and employers; high school career and technical education programs; community college courses, certificates, and associate degrees; and employer provided training. Opportunities for the Gulf Research Program: Middle-Skilled Workforce Needs is the summary of a workshop held on June 9-10, 2014 in Tampa, Florida. This workshop convened 40 thought leaders from the Gulf region's education, employer, and policymaking communities to facilitate a discussion of the current state of education and training pathways for preparing the region's middle-skilled workforce in both the short- and long-term and to identify perceived needs and potential opportunities that might be addressed by the GRP. Workshop participants discussed a variety of opportunities around building capacity in the region's middle-skilled workforce, including the need for competency-based education and training approaches and stronger partnerships among the region's employers and institutions of higher education."--Publisher's website.
"From the Institute of Medicine/National Research Council Report" printed on front cover.
Advanced computing capabilities are used to tackle a rapidly growing range of challenging science and engineering problems, many of which are compute- and data-intensive as well. Demand for advanced computing has been growing for all types and capabilities of systems, from large numbers of single commodity nodes to jobs requiring thousands of cores; for systems with fast interconnects; for systems with excellent data handling and management; and for an increasingly diverse set of applications that includes data analytics as well as modeling and simulation. Since the advent of its supercomputing centers, the National Science Foundation (NSF) has provided its researchers with state-of-the-art computing systems. The growth of new models of computing, including cloud computing and publicly available but privately held data repositories, opens up new possibilities for NSF. In order to better understand the expanding and diverse requirements of the science and engineering community and the importance of a broader range of advanced computing infrastructure, the NSF requested that the National Research Council carry out a study examining anticipated priorities and associated tradeoffs for advanced computing. This interim report identifies key issues and discusses potential options. Future Directions for NSF Advanced Computing Infrastructure to Support U.S. Science and Engineering in 2017-2020 examines priorities and associated tradeoffs for advanced computing in support of NSF-sponsored science and engineering research. This report is an initial compilation of issues to be considered as future NSF strategy, budgets, and programs for advanced computing are developed. Included in the report are questions on which the authoring committee invites comment. We invite your feedback on this report, and more generally, your comments on the future of advanced computing at NSF.
As the availability of high-throughput data-collection technologies, such as information-sensing mobile devices, remote sensing, internet log records, and wireless sensor networks has grown, science, engineering, and business have rapidly transitioned from striving to develop information from scant data to a situation in which the challenge is now that the amount of information exceeds a human's ability to examine, let alone absorb, it. Data sets are increasingly complex, and this complexity compounds problems such as missing information and other data-quality issues, data heterogeneity, and differing data formats. The nation's ability to make use of data depends heavily on the availability of a workforce that is properly trained and ready to tackle high-need areas. Training students to be capable in exploiting big data requires experience with statistical analysis, machine learning, and computational infrastructure that permits the real problems associated with massive data to be revealed and, ultimately, addressed. Analysis of big data requires cross-disciplinary skills, including the ability to make modeling decisions while balancing trade-offs between optimization and approximation, all while being attentive to useful metrics and system robustness. To develop those skills in students, it is important to identify whom to teach, that is, the educational background, experience, and characteristics of a prospective data-science student; what to teach, that is, the technical and practical content that should be taught to the student; and how to teach, that is, the structure and organization of a data-science program. Training Students to Extract Value from Big Data summarizes a workshop convened in April 2014 by the National Research Council's Committee on Applied and Theoretical Statistics to explore how best to train students to use big data. The workshop explored the need for such training and the curricula and coursework that should be included. One impetus for the workshop was the current fragmented view of what is meant by analysis of big data, data analytics, or data science. New graduate programs are introduced regularly, and they have their own notions of what is meant by those terms and, most important, of what students need to know to be proficient in data-intensive work. This report provides a variety of perspectives about those elements and about their integration into courses and curricula.
Historically, regulations governing chemical use have often focused on widely used chemicals and acute human health effects of exposure to them, as well as their potential to cause cancer and other adverse health effects. As scientific knowledge has expanded there has been an increased awareness of the mechanisms through which chemicals may exert harmful effects on human health, as well as their effects on other species and ecosystems. Identification of high-priority chemicals and other chemicals of concern has prompted a growing number of state and local governments, as well as major companies, to take steps beyond existing hazardous chemical federal legislation. Interest in approaches and policies that ensure that any new substances substituted for chemicals of concern are assessed as carefully and thoroughly as possible has also burgeoned. The overarching goal of these approaches is to avoid regrettable substitutions, which occur when a toxic chemical is replaced by another chemical that later proves unsuitable because of persistence, bioaccumulation, toxicity, or other concerns. Chemical alternatives assessments are tools designed to facilitate consideration of these factors to assist stakeholders in identifying chemicals that may have the greatest likelihood of harm to human and ecological health, and to provide guidance on how the industry may develop and adopt safer alternatives. A Framework to Guide Selection of Chemical Alternatives develops and demonstrates a decision framework for evaluating potentially safer substitute chemicals as primarily determined by human health and ecological risks. This new framework is informed by previous efforts by regulatory agencies, academic institutions, and others to develop alternative assessment frameworks that could be operationalized. In addition to hazard assessments, the framework incorporates steps for life-cycle thinking - which considers possible impacts of a chemical at all stages including production, use, and disposal - as well as steps for performance and economic assessments. The report also highlights how modern information sources such as computational modeling can supplement traditional toxicology data in the assessment process. This new framework allows the evaluation of the full range of benefits and shortcomings of substitutes, and examination of tradeoffs between these risks and factors such as product functionality, product efficacy, process safety, and resource use. Through case studies, this report demonstrates how different users in contrasting decision contexts with diverse priorities can apply the framework. This report will be an essential resource to the chemical industry, environmentalists, ecologists, and state and local governments.
"Summary of a Workshop on Mississippi River Water Quality Science and Interstate Collaboration summarizes presentations and discussions of Mississippi River and basin water quality management, monitoring, and evaluation programs that took place at a workshop that was held in St. Louis on November 18-19, 2013. The workshop examined a wide array of challenges and progress in water quality monitoring and evaluation in states along the Mississippi River corridor, and provided a forum for experts from U.S. federal agencies, the Mississippi River states, nongovernmental organizations, and the private sector to share and compare monitoring and evaluation experiences from their respective organizations."--Publisher's description.
The assessment of young children's development and learning has recently taken on new importance. Private and government organizations are developing programs to enhance the school readiness of all young children, especially children from economically disadvantaged homes and communities and children with special needs. Well-planned and effective assessment can inform teaching and program improvement, and contribute to better outcomes for children. This book affirms that assessments can make crucial contributions to the improvement of children's well-being, but only if they are well designed, implemented effectively, developed in the context of systematic planning, and interpreted and used appropriately. Otherwise, assessment of children and programs can have negative consequences for both. The value of assessments therefore requires fundamental attention to their purpose and the design of the larger systems in which they are used. Early Childhood Assessment addresses these issues by identifying the important outcomes for children from birth to age 5 and the quality and purposes of different techniques and instruments for developmental assessments.
The United States Department of Agriculture's (USDA's) Economic Research Service's (ERS) Food Availability Data System (FADS) includes three distinct but related data series on food and nutrient availability for consumption. The data serve as popular proxies for actual consumption at the national level for over 200 commodities (e.g., fresh spinach, beef, and eggs). The core Food Availability (FA) data series provides data on the amount of food available, per capita, for human consumption in the United States with data back to 1909 for many commodities. The Loss-Adjusted Food Availability (LAFA) data series is derived from the FA data series by adjusting for food spoilage, plate waste, and other losses to more closely approximate actual intake. The LAFA data provide daily estimates of the per capita availability amounts adjusted for loss (e.g., in pounds, ounces, grams, and gallons as appropriate), calories, and food pattern equivalents (i.e., "servings") of the five major food groups (fruit, vegetables, grains, meat, and dairy) available for consumption plus the amounts of added sugars and sweeteners and added fats and oils available for consumption. This fiscal year, as part of its initiative to systematically review all of its major data series, ERS decided to review FADS. One of the goals of this review is to advance the knowledge and understanding of the measurement and technical aspects of the data supporting FADS so the data can be maintained and improved. Data and Research to Improve the U.S. Food Availability System and Estimates of Food Loss is the summary of a workshop convened by the Committee on National Statistics of the National Research Council and the Food and Nutrition Board of the Institute of Medicine to advance knowledge and understanding of the measurement and technical aspects of the data supporting the LAFA data series so that these data series and subsequent food availability and food loss estimates can be maintained and improved. The workshop considered such issues as the effects of termination of selected Census Bureau and USDA data series on estimates for affected food groups and commodities; the potential for using other data sources, such as scanner data, to improve estimates of food availability; and possible ways to improve the data on food loss at the farm and retail levels and at restaurants. This report considers knowledge gaps, data sources that may be available or could be generated to fill gaps, what can be learned from other countries and international organizations, ways to ensure consistency of treatment of commodities across series, and the most promising opportunities for new data for the various food availability series.
"Development Planning provides recommendations to improve development planning for near-term acquisition projects, concepts not quite ready for acquisition, corporate strategic plans, and training of acquisition personnel. This report reviews past uses of development planning by the Air Force, and offers an organizational construct that will help the Air Force across its core functions. Developmental planning, used properly by experienced practitioners, can provide the Air Force leadership with a tool to answer the critical question, Over the next 20 years in 5-year increments, what capability gaps will the Air Force have that must be filled? Development planning will also provide for development of the workforce skills needed to think strategically and to defectively define and close the capability gap. This report describes what development planning could be and should be for the Air Force."--Publisher description.