Disposal of Surplus Plutonium at the Waste Isolation Pilot Plant: Interim Report evaluates the general viability of the U.S. Department of Energy's National Nuclear Security Administration's (DOE-NNSA's) conceptual plans for disposing of 34 metric tons (MT) of surplus plutonium in the Waste Isolation Pilot Plant (WIPP), a deep geologic repository near Carlsbad, New Mexico. The report examines DOE-NNSA's plans to ship, receive, and emplace surplus plutonium in WIPP, as well as its understanding of the impacts of these plans on WIPP and on WIPP-bound waste streams. The first of two reports to be issued during this study, it provides a preliminary assessment of the general viability of the conceptual plans, focusing on some of the barriers to their implementation.
An Assessment of Four Divisions of the Physical Measurement Laboratory at the National Institute of Standards and Technology: Fiscal Year 2018 assesses the scientific and technical work performed by four divisions of the National Institute of Standards and Technology (NIST) Physical Measurement Laboratory. This publication reviews technical reports and technical program descriptions prepared by NIST staff and summarizes the findings of the authoring panel.
"A consensus study report of The National Academy of Sciences, Engineering, Medicine."
To achieve goals for climate and economic growth, "negative emissions technologies" (NETs) that remove and sequester carbon dioxide from the air will need to play a significant role in mitigating climate change. Unlike carbon capture and storage technologies that remove carbon dioxide emissions directly from large point sources such as coal power plants, NETs remove carbon dioxide directly from the atmosphere or enhance natural carbon sinks. Storing the carbon dioxide from NETs has the same impact on the atmosphere and climate as simultaneously preventing an equal amount of carbon dioxide from being emitted. Recent analyses found that deploying NETs may be less expensive and less disruptive than reducing some emissions, such as a substantial portion of agricultural and land-use emissions and some transportation emissions. In 2015, the National Academies published Climate Intervention: Carbon Dioxide Removal and Reliable Sequestration, which described and initially assessed NETs and sequestration technologies. This report acknowledged the relative paucity of research on NETs and recommended development of a research agenda that covers all aspects of NETs from fundamental science to full-scale deployment. To address this need, Negative Emissions Technologies and Reliable Sequestration: A Research Agenda assesses the benefits, risks, and "sustainable scale potential" for NETs and sequestration. This report also defines the essential components of a research and development program, including its estimated costs and potential impact.
The Forum on Cyber Resilience of the National Academies of Sciences, Engineering, and Medicine hosted the Workshop on Recoverability as a First-Class Security Objective on February 8, 2018, in Washington, D.C. The workshop featured presentations from several experts in industry, research, and government roles who spoke about the complex facets of recoverability--that is, the ability to restore normal operations and security in a system affected by software or hardware failure or a deliberate attack. This publication summarizes the presentations and discussions from the workshop.
"Modern science is ever more driven by computations and simulations. In particular, the state of the art in space and Earth science often arises from complex simulations of climate, space weather, and astronomical phenomena. At the same time, scientific work requires data processing, presentation, and analysis through broadly available proprietary and community software. Implicitly or explicitly, software is central to science. Scientific discovery, understanding, validation, and interpretation are all enhanced by access to the source code of the software used by scientists. This report investigates and recommends options for NASA's Science Mission Directorate (SMD) as it considers how to establish a policy regarding open source software to complement its existing policy on open data. In particular, the report reviews existing data and software policies and the lessons learned from the implementation of those policies, summarizes community perspectives, and presents policy options and recommendations for implementing an open source software policy for NASA SMD."--Publisher's description.
"Solving problems related to use of water resources will be of paramount importance in coming decades as increasing pressure from growing populations, climate change, extreme weather, and aging water-related infrastructure threaten water availability and quality. The Water Mission Area (WMA) of the U.S. Geological Survey (USGS) has a long-established reputation for collecting and delivering high-quality, unbiased scientific information related to the nation's water resources. WMA observations help inform decisions ranging from rapid responses during emergencies such as hurricanes, floods, and forest fires, to the long-term management of water resources. Produced at the request of USGS, this report identifies the nation's highest-priority water science and resources challenges over the next 25 years. Future Water Priorities for the Nation summarizes WMA's current water science and research portfolio, and recommends strategic opportunities for WMA to more effectively address the most pressing challenges"--
Is rapid world population growth actually coming to an end? As population growth and its consequences have become front-page issues, projections of slowing growth from such institutions as the United Nations and the World Bank have been called into question. Beyond Six Billion asks what such projections really say, why they say it, whether they can be trusted, and whether they can be improved. The book includes analysis of how well past U.N. and World Bank projections have panned out, what errors have occurred, and why they have happened. Focusing on fertility as one key to accurate projections, the committee examines the transition from high, constant fertility to low fertility levels and discusses whether developing countries will eventually attain the very low levels of births now observed in the industrialized world. Other keys to accurate projections, predictions of lengthening life span and of the impact of international migration on specific countries, are also explored in detail. How good are our methods of population forecasting? How can we cope with the inevitable uncertainty? What population trends can we anticipate? Beyond Six Billion illuminates not only the forces that shape population growth but also the accuracy of the methods we use to quantify these forces and the uncertainty surrounding projections. The Committee on Population was established by the National Academy of Sciences (NAS) in 1983 to bring the knowledge and methods of the population sciences to bear on major issues of science and public policy. The committee's work includes both basic studies of fertility, health and mortality, and migration; and applied studies aimed at improving programs for the public health and welfare in the United States and in developing countries. The committee also fosters communication among researchers in different disciplines and countries and policy makers in government, international agencies, and private organizations. The work of the committee is made possible by funding from several government agencies and private foundations.
Total quality management (TQM), reengineering, the workplace of the twenty-first century--the 1990s have brought a sense of urgency to organizations to change or face stagnation and decline, according to Enhancing Organizational Performance. Organizations are adopting popular management techniques, some scientific, some faddish, often without introducing them properly or adequately measuring the outcome. Enhancing Organizational Performance reviews the most popular current approaches to organizational change--total quality management, reengineering, and downsizing--in terms of how they affect organizations and people, how performance improvements can be measured, and what questions remain to be answered by researchers. The committee explores how theory, doctrine, accepted wisdom, and personal experience have all served as sources for organization design. Alternative organization structures such as teams, specialist networks, associations, and virtual organizations are examined. Enhancing Organizational Performance looks at the influence of the organization's norms, values, and beliefs--its culture--on people and their performance, identifying cultural "levers" available to organization leaders. And what is leadership? The committee sorts through a wealth of research to identify behaviors and skills related to leadership effectiveness. The volume examines techniques for developing these skills and suggests new competencies that will become required with globalization and other trends. Mergers, networks, alliances, coalitions--organizations are increasingly turning to new intra- and inter-organizational structures. Enhancing Organizational Performance discusses how organizations cooperate to maximize outcomes. The committee explores the changing missions of the U.S. Army as a case study that has relevance to any organization. Noting that a musical greeting card contains more computing power than existed in the entire world before 1950, the committee addresses the impact of new technologies on performance. With examples, insights, and practical criteria, Enhancing Organizational Performance clarifies the nature of organizations and the prospects for performance improvement. This book will be important to corporate leaders, executives, and managers; faculty and students in organizational performance and the social sciences; business journalists; researchers; and interested individuals.
The brain ... There is no other part of the human anatomy that is so intriguing. How does it develop and function and why does it sometimes, tragically, degenerate? The answers are complex. In Discovering the Brain, science writer Sandra Ackerman cuts through the complexity to bring this vital topic to the public. The 1990s were declared the "Decade of the Brain" by former President Bush, and the neuroscience community responded with a host of new investigations and conferences. Discovering the Brain is based on the Institute of Medicine conference, Decade of the Brain: Frontiers in Neuroscience and Brain Research. Discovering the Brain is a "field guide" to the brain--an easy-to-read discussion of the brain's physical structure and where functions such as language and music appreciation lie. Ackerman examines: How electrical and chemical signals are conveyed in the brain. The mechanisms by which we see, hear, think, and pay attention--and how a "gut feeling" actually originates in the brain. Learning and memory retention, including parallels to computer memory and what they might tell us about our own mental capacity. Development of the brain throughout the life span, with a look at the aging brain. Ackerman provides an enlightening chapter on the connection between the brain's physical condition and various mental disorders and notes what progress can realistically be made toward the prevention and treatment of stroke and other ailments. Finally, she explores the potential for major advances during the "Decade of the Brain," with a look at medical imaging techniques--what various technologies can and cannot tell us--and how the public and private sectors can contribute to continued advances in neuroscience. This highly readable volume will provide the public and policymakers--and many scientists as well--with a helpful guide to understanding the many discoveries that are sure to be announced throughout the "Decade of the Brain."
"Medium- and heavy-duty trucks, motor coaches, and transit buses - collectively, "medium- and heavy-duty vehicles," or MHDVs - are used in every sector of the economy. The fuel consumption and greenhouse gas emissions of MHDVs have become a focus of legislative and regulatory action in the past few years. Reducing the Fuel Consumption and Greenhouse Gas Emissions of Medium- and Heavy-Duty Vehicles, Phase Two is a follow-on to the National Research Council's 2010 report, Technologies and Approaches to Reducing the Fuel Consumption of Medium-and Heavy-Duty Vehicles. That report provided a series of findings and recommendations on the development of regulations for reducing fuel consumption of MHDVs. This report comprises the first periodic, five-year follow-on to the 2010 report. Reducing the Fuel Consumption and Greenhouse Gas Emissions of Medium- and Heavy-Duty Vehicles, Phase Two reviews NHTSA fuel consumption regulations and considers the technological, market and regulatory factors that may be of relevance to a revised and updated regulatory regime taking effect for model years 2019-2022. The report analyzes and provides options for improvements to the certification and compliance procedures for medium- and heavy-duty vehicles; reviews an updated analysis of the makeup and characterization of the medium- and heavy-duty truck fleet; examines the barriers to and the potential applications of natural gas in class 2b through class 8 vehicles; and addresses uncertainties and performs sensitivity analyses for the fuel consumption and cost/benefit estimates."--Publisher's description.
"Advancing the state of aviation safety is a central mission of the National Aeronautics and Space Administration (NASA). Congress requested this review of NASA's aviation safety-related research programs, seeking an assessment of whether the programs have well-defined, prioritized, and appropriate research objectives; whether resources have been allocated appropriately among these objectives; whether the programs are well coordinated with the safety research programs of the Federal Aviation Administration; and whether suitable mechanisms are in place for transitioning the research results into operational technologies and procedures and certification activities in a timely manner. Advancing Aeronautical Safety contains findings and recommendations with respect to each of the main aspects of the review sought by Congress. These findings indicate that NASA's aeronautics research enterprise has made, and continues to make, valuable contributions to aviation system safety but it is falling short and needs improvement in some key respects."--Publisher's description.
The potential for fatigue to negatively affect human performance is well established. Concern about this potential in the aviation context extends back decades, with both airlines and pilots agreeing that fatigue is a safety concern. A more recent consideration is whether and how pilot commuting, conducted in a pilot's off-duty time, may affect fatigue. The National Academy of Sciences was asked to review available information related to the prevalence and characteristics of pilot commuting; sleep, fatigue, and circadian rhythms; airline and regulatory oversight policies; and pilot and airline practices. This interim report summarizes the committee's review to date of the available information. The final report will present the committee's complete review, along with its conclusions and recommendations based on the information available during its deliberations.
Nearly everyone experiences fatigue, but some professions--such as aviation, medicine, and the military--demand alert, precise, rapid, and well-informed decision making and communication with little margin for error. The potential for fatigue to negatively affect human performance is well established. Concern about this potential in the aviation context extends back decades, with both airlines and pilots agreeing that fatigue is a safety concern. A more recent consideration is whether and how pilot commuting, conducted in a pilot's off-duty time, may affect fatigue during flight duty. In summer 2010 the U.S. Congress directed the Federal Aviation Administration (FAA) to update the federal regulations that govern pilot flight and duty time, taking into account recent research related to sleep and fatigue. As part of its directive, Congress also instructed FAA to have the National Academy of Sciences conduct a study on the effects of commuting on pilot fatigue. The Effects of Commuting on Pilot Fatigue reviews research and other information related to the prevalence and characteristics of commuting; to the science of sleep, fatigue, and circadian rhythms; to airline and regulatory oversight policies; and to pilot and airline practices. The report also discusses the policy, economic, and regulatory issues that affect pilot commuting, and outlines potential next steps, including recommendations for regulatory or administrative actions, or further research by the FAA.
The Bureau of Reclamation and Sandia National Laboratories jointly developed the Roadmap to serve as a strategic research pathway for desalination and water purification technologies to meet future water needs. The book recommends that the Roadmap include a sharper focus on the research and technological advancements needed to reach the long-term objectives. The book also suggests that the environmental, economic, and social costs of energy required by increased dependence on desalination be examined. Strategies for implementing the Roadmap initiative are provided.
Progress in information technology (IT) has been remarkable, but the best truly is yet to come: the power of IT as a human enabler is just beginning to be realized. Whether the nation builds on this momentum or plateaus prematurely depends on today's decisions about fundamental research in computer science (CS) and the related fields behind IT. The Computer Science and Telecommunications Board (CSTB) has often been asked to examine how innovation occurs in IT, what the most promising research directions are, and what impacts such innovation might have on society. Consistent themes emerge from CSTB studies, notwithstanding changes in information technology itself, in the IT-producing sector, and in the U.S. university system, a key player in IT research. In this synthesis report, based largely on the eight CSTB reports enumerated below, CSTB highlights these themes and updates some of the data that support them.
Recent rough estimates are that the U.S. Department of Defense (DoD) spends at least $38 billion a year on the research, development, testing, and evaluation of new defense systems; approximately 40 percent of that cost--at least $16 billion--is spent on software development and testing. There is widespread understanding within DoD that the effectiveness of software-intensive defense systems is often hampered by low-quality software as well as increased costs and late delivery of software components. Given the costs involved, even relatively incremental improvements to the software development process for defense systems could represent a large savings in funds. And given the importance of producing defense software that will carry out its intended function, even relatively small improvements to the quality of defense software systems would be extremely important to identify. DoD software engineers and test and evaluation officials may not be fully aware of a range of available techniques, both because these techniques have been developed only recently and because they originate from a statistical perspective somewhat removed from software engineering. The panel's charge therefore was to convene a workshop to identify statistical software engineering techniques that could have applicability to DoD systems in development.
From 1962 to 1971, US military forces sprayed more than 19 million gallons of herbicides over Vietnam to strip the thick jungle canopy that helped conceal opposition forces, to destroy crops that enemy forces might depend on, and to clear tall grass and bushes from around the perimeters of US base camps and outlying fire-support bases. Most large-scale spraying operations were conducted from airplanes and helicopters, but herbicides were also sprayed from boats and ground vehicles, and by soldiers wearing back-mounted equipment. After a scientific report concluded that a contaminant of one of the primary chemicals used in the herbicide called Agent Orange could cause birth defects in laboratory animals, US forces suspended use of the herbicide; they subsequently halted all herbicide spraying in Vietnam in 1971. At the request of the Veterans Administration (VA), the Institute of Medicine (IOM) established a committee to oversee the development and evaluation of models of herbicide exposure for use in studies of Vietnam veterans. That committee would develop and disseminate a request for proposals (RFP) consistent with the recommendations; evaluate the proposals received in response to the RFP and select one or more academic or other nongovernmental research groups to develop the exposure reconstruction model; provide scientific and administrative oversight of the work of the researchers; and evaluate the models developed by the researchers in a report to VA, which would be published for a broader audience. Characterizing Exposure of Veterans to Agent Orange and Other Herbicides Used in Vietnam is the IOM report evaluating the development and testing of these herbicide exposure reconstruction models for use in studies of Vietnam veterans.
The report evaluates a White Paper written by restoration planners in South Florida on the role of water flow in restoration plans. The report concludes that there is strong evidence that the velocity, rate, and spatial distribution of water flow play important roles in maintaining the tree islands and other ecologically important landscape features of the Everglades.
Assessment in Support of Instruction and Learning is the summary of a National Research Council workshop convened to examine the gap between external and classroom assessment. The report discusses issues associated with designing an assessment system that meets the demands of public accountability and, at the same time, improves the quality of the education that students receive day by day, focusing on assessment that addresses both accountability and learning.
In November 1999, the U.S. General Services Administration (GSA) and the U.S. Department of State convened a symposium to discuss the apparently conflicting objectives of security from terrorist attack and the design of public buildings in an open society. The symposium sponsors rejected the notion of rigid, prescriptive design approaches. The symposium concluded with a challenge to the design and security professions to craft aesthetically appealing architectural solutions that achieve balanced, performance-based approaches to both openness and security. In response to a request from the Office of the Chief Architect of the Public Buildings Service, the National Research Council (NRC) assembled a panel of independent experts, the Committee to Review the Security Design Criteria of the Interagency Security Committee (ISC). This committee was tasked to evaluate the ISC Security Design Criteria to determine whether particular provisions might be too prescriptive to allow a design professional "reasonable flexibility" in achieving desired security and physical protection objectives.
The report provides an independent assessment of suitable test protocols that might be useful and reliable for the testing and evaluation of standoff chemical agent detectors. It proposes two testing protocols, one for passive detectors and one for active detectors, to help ensure the reliable detection of a release of chemical warfare agents. The report concluded that testing these detectors by releasing chemical warfare agents into the atmosphere would provide no more useful information on their effectiveness than would a rigorous laboratory testing protocol using chemical agents combined with atmospheric release of simulated chemical warfare agents.
The technical, scientific, policy, and institutional environment for conducting Earth science research has been changing rapidly over the past few decades. Changes in the technical environment are due both to the advent of new types and sources of remote sensing data, which have higher spatial and spectral resolution, and to the development of vastly expanded capabilities in data access, visualization, spatial data integration, and data management. The scientific environment is changing because of the strong emphasis on global change research, both nationally and internationally, and the evolving data requirements for that research. And the policy and institutional environment for the production of Earth observation data is changing with the diversification of both remote sensing data and the institutions that produce the data. In this report, the Space Studies Board's Steering Committee on Space Applications and Commercialization explores the implications of this changing environment, examining the opportunities and challenges it presents.
Knowledge of time is essential to precise knowledge of location, and for this reason the Navy, with its need to navigate on the high seas, has historically played an important role in the development and application of advanced time realization and dissemination technologies. Discoveries coming from basic research funded by the Office of Naval Research (ONR) lie at the heart of today's highest performance atomic clocks; Naval Research Laboratory (NRL) expertise played a role in developing the space-qualified atomic clocks that enable the Global Positioning System (GPS); and the U.S. Naval Observatory (USNO) maintains and disseminates the standard of time for all of the Department of Defense (DOD). The Navy has made major investments in most aspects of precision time and time interval (PTTI) science and technology, although specific PTTI-related research has also been funded by the Defense Advanced Research Projects Agency (DARPA) and non-DOD agencies such as the National Science Foundation (NSF), the National Aeronautics and Space Administration (NASA), and the Department of Commerce. Navy funding, largely through ONR, has a history of being an early enabler of key new developments. Judicious funding decisions by the Navy--particularly by ONR program officers--have underpinned most of the major advances in PTTI science and technology (S&T) in the last 50 years. PTTI is important to modern naval needs, and indeed to all the armed Services, for use in both navigation and communications. Precise time synchronization is needed to efficiently determine the start of a code sequence in secure communications, to perform navigation, and to locate the position of signal emitters. Precise frequency control is required in communications for spectrum utilization and frequency-hopped spread-spectrum techniques. There are many examples of essential military operations that depend on PTTI and could benefit from improvements in PTTI technology. These include:
- GPS clocks and autonomous operations,
- Weapon system four-dimensional coordination,
- GPS antijamming,
- Network-centric warfare, and
- Secure military communications.
The report concludes that reductions in the size, weight, and power requirements of PTTI devices, and increases in their ruggedness without sacrificing performance, would put more accurate and precise timekeeping in the hands of the warrior, improving capabilities in all of the above operations.
The Committee on Modeling and Simulation Enhancements for 21st Century Manufacturing and Acquisition was formed by the NRC in response to a request from the Defense Modeling and Simulation Office (DMSO) of DOD. The committee was asked to (1) investigate next-generation evolutionary and revolutionary M&S capabilities that will support enhanced defense systems acquisition; (2) identify specific emerging design, testing, and manufacturing process technologies that can be enabled by advanced M&S capabilities; (3) relate these emerging technologies to long-term DOD requirements; (4) assess ongoing efforts to develop advanced M&S capabilities and identify gaps that must be filled to make the emerging technologies a reality; (5) identify lessons learned from industry; and (6) recommend specific government actions to expedite development and to enable maximum DOD and U.S. commercial benefit from these capabilities. To complete its task, the committee identified relevant trends and their impact on defense acquisition needs; current use and support for use of M&S within DOD; lessons learned from commercial manufacturing; three cross-cutting and especially challenging uses of M&S technologies; and the areas in which basic research is needed in M&S in order to achieve the desired goals for manufacturing and defense acquisition.
Recent outbreaks of foot-and-mouth disease (FMD) and bovine spongiform encephalopathy (BSE) in Europe and Japan set off alarm bells in the United States and other nations, prompting a flurry of new regulations, border controls, inspections, and other activities to prevent incursions of the diseases. The terrorist attacks in New York City and Washington, DC, added a new note of urgency to the alarm. Concerned about additional acts of terror or sabotage in various sectors of the economy, including agriculture, U.S. government and industry officials have begun to reevaluate emergency management plans in response to these threats and to shift the focus of research and planning. More than 200 representatives of government, industry, academia, and nongovernmental organizations gathered at a one-day workshop in Washington, DC, on January 15, 2002, to assess what the United States is doing about emerging animal diseases and related issues and to explore what still needs to be done. Major objectives of the workshop included: (1) elucidating information on the U.S. position with regard to potentially threatening animal diseases; (2) identifying critical problems, barriers, and data gaps; and (3) defining potential future National Academies' activities. Emerging Animal Diseases describes the issues presented and discussed by the workshop participants. This report summary extracts the key technical issues from the presentations and discussions, rather than presenting each session and panel discussion separately. Many issues were touched upon repeatedly by several speakers in different sessions, and this format is intended to allow readers who did not attend the workshop to have a good understanding of the discussions in the context of the entire workshop.
Each new generation of commercial aircraft produces less noise and fewer emissions per passenger-kilometer (or ton-kilometer of cargo) than the previous generation. However, the demand for air transportation services grows so quickly that total aircraft noise and emissions continue to increase. Meanwhile, federal, state, and local noise and air quality standards in the United States and overseas have become more stringent. It is becoming more difficult to reconcile public demand for inexpensive, easily accessible air transportation services with concurrent desires to reduce noise, improve local air quality, and protect the global environment against climate change and depletion of stratospheric ozone. This situation calls for federal leadership and strong action from industry and government. U.S. government, industry, and universities conduct research and develop technology that could help reduce aircraft noise and emissions--but only if the results are used to improve operational systems or standards. For example, the (now terminated) Advanced Subsonic Technology Program of the National Aeronautics and Space Administration (NASA) generally brought new technology only to the point where a system, subsystem model, or prototype was demonstrated or could be validated in a relevant environment. Completing the maturation process--by fielding affordable, proven, commercially available systems for installation on new or modified aircraft--was left to industry and generally took place only if industry had an economic or regulatory incentive to make the necessary investment. In response to this situation, the Federal Aviation Administration, NASA, and the Environmental Protection Agency asked the Aeronautics and Space Engineering Board of the National Research Council to recommend research strategies and approaches that would further efforts to mitigate the environmental effects (i.e., noise and emissions) of aviation. The statement of task required the Committee on Aeronautics Research and Technology for Environmental Compatibility to assess whether existing research policies and programs are likely to foster the technological improvements needed to ensure that environmental constraints do not become a significant barrier to growth of the aviation sector.
This study, commissioned by the National Aeronautics and Space Administration (NASA), examines the role of robotic exploration missions in assessing the risks to the first human missions to Mars. Only those hazards arising from exposure to environmental, chemical, and biological agents on the planet are assessed. To ensure that it was including all previously identified hazards in its study, the Committee on Precursor Measurements Necessary to Support Human Operations on the Surface of Mars referred to the most recent report from NASA's Mars Exploration Program/Payload Analysis Group (MEPAG) (Greeley, 2001). The committee concluded that the requirements identified in the present NRC report are indeed the only ones essential for NASA to pursue in order to mitigate potential hazards to the first human missions to Mars.
Owing to the expected nature of combat in 2010, U.S. military forces face a pressing need to transform themselves for rapid response to an unpredictable threat. Rapid advances in commercial technology (particularly in electronics), coupled with the easy access to commercial technology enjoyed by potential adversaries, will compel DOD and defense contractors to excel at integrating commercial technology into defense systems. This integration of commercial and military manufacturing (ICMM) has begun on a small scale. By 2010, it needs to increase substantially if U.S. forces are to retain a technological edge. This report assesses the opportunities for increased ICMM in 2010 and beyond, identifies barriers, and recommends strategies for overcoming them.