In 2011, the International Monetary Fund invited prominent economists and economic policymakers to consider the brave new world of the post-crisis global economy. The result is a book that captures the state of macroeconomic thinking at a transformational moment. The crisis and the weak recovery that has followed raise fundamental questions concerning macroeconomics and economic policy. These top economists discuss future directions for monetary policy, fiscal policy, financial regulation, capital-account management, growth strategies, the international monetary system, and the economic models that should underpin thinking about critical policy choices.
A thorough exposition of quantum computing and the underlying concepts of quantum physics, with explanations of the relevant mathematics and numerous examples. The combination of two of the twentieth century's most influential and revolutionary scientific theories, information theory and quantum mechanics, gave rise to a radically new view of computing and information. Quantum information processing explores the implications of using quantum mechanics instead of classical mechanics to model information and its processing. Quantum computing is not about changing the physical substrate on which computation is done from classical to quantum but about changing the notion of computation itself, at the most basic level. The fundamental unit of computation is no longer the bit but the quantum bit or qubit. This comprehensive introduction to the field offers a thorough exposition of quantum computing and the underlying concepts of quantum physics, explaining all the relevant mathematics and offering numerous examples. With its careful development of concepts and thorough explanations, the book makes quantum computing accessible to students and professionals in mathematics, computer science, and engineering. A reader with no prior knowledge of quantum physics (but with sufficient knowledge of linear algebra) will be able to gain a fluent understanding by working through the book.
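The book's starting point, that the unit of computation becomes the qubit rather than the bit, can be illustrated in a few lines. Below is a minimal sketch (in Python with NumPy, not drawn from the book, whose development uses mathematical notation rather than code; all names are illustrative) of a qubit in equal superposition and its Born-rule measurement probabilities:

```python
# Minimal qubit sketch: a qubit is a unit vector in C^2.
# Illustrative only; variable names are not the book's notation.
import numpy as np

ket0 = np.array([1.0, 0.0])  # |0>, analogous to the classical bit 0
ket1 = np.array([0.0, 1.0])  # |1>, analogous to the classical bit 1

# Equal superposition: (|0> + |1>) / sqrt(2)
psi = (ket0 + ket1) / np.sqrt(2)

# Born rule: measurement probabilities are the squared amplitudes
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5]: each outcome equally likely

# Simulate one measurement
print(np.random.choice([0, 1], p=probs))
```

Unlike a classical bit, psi holds both amplitudes at once, and measurement collapses it to a single outcome; this is the shift in the notion of computation the description refers to.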
The new edition of an introduction to computer programming within the context of the visual arts, using the open-source programming language Processing; thoroughly updated throughout. The visual arts are rapidly changing as media moves into the web, mobile devices, and architecture. When designers and artists learn the basics of writing software, they develop a new form of literacy that enables them to create new media for the present, and to imagine future media that are beyond the capacities of current software tools. This book introduces this new literacy by teaching computer programming within the context of the visual arts. It offers a comprehensive reference and text for Processing (www.processing.org), an open-source programming language that can be used by students, artists, designers, architects, researchers, and anyone who wants to program images, animation, and interactivity. Written by Processing's cofounders, the book offers a definitive reference for students and professionals. Tutorial chapters make up the bulk of the book; advanced professional projects from such domains as animation, performance, and installation are discussed in interviews with their creators. This second edition has been thoroughly updated. It is the first book to offer in-depth coverage of Processing 2.0 and 3.0, and all examples have been updated for the new syntax. Every chapter has been revised, and new chapters introduce new ways to work with data and geometry. New “synthesis” chapters offer discussion and worked examples of such topics as sketching with code, modularity, and algorithms. New interviews have been added that cover a wider range of projects. “Extension” chapters are now offered online so they can be updated to keep pace with technological developments in such fields as computer vision and electronics.
Interviews: SUE.C, Larry Cuba, Mark Hansen, Lynn Hershman Leeson, Jürg Lehni, LettError, Golan Levin and Zachary Lieberman, Benjamin Maus, Manfred Mohr, Ash Nehru, Josh On, Bob Sabiston, Jennifer Steinkamp, Jared Tarbell, Steph Thirion, Robert Winter
A program for building a global clean energy economy while expanding job opportunities and economic well-being. In order to control climate change, the Intergovernmental Panel on Climate Change (IPCC) estimates that greenhouse gas emissions will need to fall by about forty percent by 2030. Achieving these targets will be highly challenging. Yet in Greening the Global Economy, economist Robert Pollin shows that they are attainable through steady, large-scale investments—totaling about 1.5 percent of global GDP on an annual basis—in both energy efficiency and clean renewable energy sources. Not only that: Pollin argues that with the right investments, these efforts will expand employment and drive economic growth. Drawing on years of research, Pollin explores all aspects of the problem: how much energy will be needed in a range of industrialized and developing economies; what efficiency targets should be; and what kinds of industrial policy will maximize investment and support private and public partnerships in green growth so that a clean energy transformation can unfold without broad subsidies. All too frequently, inaction on climate change is blamed on its potential harm to the economy. Pollin shows that greening the economy is not only possible but necessary: global economic growth depends on it.
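To give a rough sense of the scale Pollin proposes, here is a back-of-envelope sketch (Python; the GDP figure is an assumed round number for illustration, not taken from the book):

```python
# Rough scale of Pollin's proposal: 1.5 percent of global GDP per year.
# The GDP figure below is an assumption, not a number from the text.
global_gdp_usd = 90e12      # assume ~USD 90 trillion in global GDP
investment_share = 0.015    # 1.5 percent, per Pollin's estimate
annual_investment = global_gdp_usd * investment_share
print(f"~${annual_investment / 1e12:.2f} trillion per year")  # ~$1.35 trillion
```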
As the number, complexity, and scope of large engineering projects (LEPs) increase worldwide, the huge stakes may endanger the survival of corporations and threaten the stability of countries that approach these projects unprepared. According to the authors, the “front-end” engineering of institutional arrangements and strategic systems is a far greater determinant of an LEP's success than are the more tangible aspects of project engineering and management. The book is based on an international research project that analyzed sixty LEPs, among them the Boston Harbor cleanup; the first phase of subway construction in Ankara, Turkey; a hydro dam on the Caroni River in Venezuela; and the construction of offshore oil platforms west of Florø, Norway. The authors use the research results to develop an experience-based theoretical framework that will allow managers to understand and respond to the complexity and uncertainty inherent in all LEPs. In addition to managers and scholars of large-scale projects, the book will be of interest to those studying the relationship between institutions and strategy, risk management, and corporate governance in general.
Contributors: Bjørn Andersen, Richard Brealey, Ian Cooper, Serghei Floricel, Michel Habib, Brian Hobbs, Donald R. Lessard, Pascale Michaud, Roger Miller, Xavier Olleros
In this rigorous investigation into the logic of truth, Anil Gupta and Nuel Belnap explain how the concept of truth works in both ordinary and pathological contexts. The latter include, for instance, contexts that generate the Liar Paradox. Their central claim is that truth is a circular concept. In support of this claim they provide a widely applicable theory (the revision theory) of circular concepts. Under the revision theory, when truth is seen as circular, both its ordinary features and its pathological features fall into a simple, understandable pattern. The Revision Theory of Truth is unique in placing truth in the context of a general theory of definitions. This theory makes sense of arbitrary systems of mutually interdependent concepts, of which circular concepts, such as truth, are but a special case.
A groundbreaking analysis of the relationship between culture and technology.
A detailed analysis of the policy effects of conservatives' decades-long effort to dismantle the federal regulatory framework for environmental protection.
A pioneering proposal for a pluralistic extension of evolutionary theory, now updated to reflect the most recent research. This new edition of the widely read Evolution in Four Dimensions has been revised to reflect the spate of new discoveries in biology since the book was first published in 2005, offering corrections, an updated bibliography, and a substantial new chapter. Eva Jablonka and Marion Lamb's pioneering argument proposes that there is more to heredity than genes. They describe four “dimensions” in heredity—four inheritance systems that play a role in evolution: genetic, epigenetic (or non-DNA cellular transmission of traits), behavioral, and symbolic (transmission through language and other forms of symbolic communication). These systems, they argue, can all provide variations on which natural selection can act. Jablonka and Lamb present a richer, more complex view of evolution than that offered by the gene-based Modern Synthesis, arguing that induced and acquired changes also play a role. Their lucid and accessible text is accompanied by artist-physician Anna Zeligowski's lively drawings, which humorously and effectively illustrate the authors' points. Each chapter ends with a dialogue in which the authors refine their arguments against the vigorous skepticism of the fictional “I.M.” (for Ipcha Mistabra—Aramaic for “the opposite conjecture”). The extensive new chapter, presented engagingly as a dialogue with I.M., updates the information on each of the four dimensions—with special attention to the epigenetic, where there has been an explosion of new research.
Praise for the first edition:
“With courage and verve, and in a style accessible to general readers, Jablonka and Lamb lay out some of the exciting new pathways of Darwinian evolution that have been uncovered by contemporary research.” —Evelyn Fox Keller, MIT, author of Making Sense of Life: Explaining Biological Development with Models, Metaphors, and Machines
“In their beautifully written and impressively argued new book, Jablonka and Lamb show that the evidence from more than fifty years of molecular, behavioral and linguistic studies forces us to reevaluate our inherited understanding of evolution.” —Oren Harman, The New Republic
“It is not only an enjoyable read, replete with ideas and facts of interest, but it does the most valuable thing a book can do—it makes you think and reexamine your premises and long-held conclusions.” —Adam Wilkins, BioEssays
An examination of informal urban activities, including street vending, garage sales, and unpermitted housing, that explores their complexity and addresses related planning and regulatory issues.
A theory of HCI that uses concepts from semiotics and computer science to focus on the communication between designers and users during interaction.
Scholars and artists revisit a hugely influential essay by Rosalind Krauss and map the interactions between art and architecture over the last thirty-five years. Expansion, convergence, adjacency, projection, rapport, and intersection are a few of the terms used to redraw the boundaries between art and architecture during the last thirty-five years. If modernists invented the model of an ostensible “synthesis of the arts,” their postmodern progeny promoted the semblance of pluralist fusion. In 1979, reacting against contemporary art's transformation of modernist medium-specificity into postmodernist medium multiplicity, the art historian Rosalind Krauss published an essay, “Sculpture in the Expanded Field,” that laid out in a precise diagram the structural parameters of sculpture, architecture, and landscape art. Krauss tried to clarify what these art practices were, what they were not, and what they could become if logically combined. The essay soon assumed a canonical status and affected subsequent developments in all three fields. Retracing the Expanded Field revisits Krauss's hugely influential text and maps the ensuing interactions between art and architecture. Responding to Krauss and revisiting the milieu from which her text emerged, artists, architects, and art historians of different generations offer their perspectives on the legacy of “Sculpture in the Expanded Field.” Krauss herself takes part in a roundtable discussion (moderated by Hal Foster). A selection of historical documents, including Krauss's essay, presented as it appeared in October, accompanies the main text. Neither eulogy nor hagiography, Retracing the Expanded Field documents the groundbreaking nature of Krauss's authoritative text and reveals the complex interchanges between art and architecture that increasingly shape both fields.
Contributors: Stan Allen, George Baker, Yve-Alain Bois, Benjamin Buchloh, Beatriz Colomina, Penelope Curtis, Sam Durant, Edward Eigen, Kurt W. Forster, Hal Foster, Kenneth Frampton, Branden W. Joseph, Rosalind Krauss, Miwon Kwon, Sylvia Lavin, Sandro Marpillero, Josiah McElheny, Eve Meltzer, Michael Meredith, Mary Miss, Sarah Oppenheimer, Matthew Ritchie, Julia Robinson, Joe Scanlan, Emily Eliza Scott, Irene Small, Philip Ursprung, Anthony Vidler
The fifth edition of a work that defines the field of cognitive neuroscience, with entirely new material that reflects recent advances in the field.
New ways to design spaces for online interaction, and how they will change society.
A comprehensive guide to the conceptual, mathematical, and implementational aspects of analyzing electrical brain signals, including data from MEG, EEG, and LFP recordings. This book offers a comprehensive guide to the theory and practice of analyzing electrical brain signals. It explains the conceptual, mathematical, and implementational (via Matlab programming) aspects of time-, time-frequency-, and synchronization-based analyses of magnetoencephalography (MEG), electroencephalography (EEG), and local field potential (LFP) recordings from humans and nonhuman animals. It is the only book on the topic that covers both the theoretical background and the implementation in language that can be understood by readers without extensive formal training in mathematics, including cognitive scientists, neuroscientists, and psychologists. Readers who go through the book chapter by chapter and implement the examples in Matlab will develop an understanding of why and how analyses are performed, how to interpret results, what the methodological issues are, and how to perform single-subject-level and group-level analyses. Researchers who are familiar with using automated programs to perform advanced analyses will learn what happens when they click the “analyze now” button. The book provides sample data and downloadable Matlab code. Each of the 38 chapters covers one analysis topic, and these topics progress from simple to advanced. Most chapters conclude with exercises that further develop the material covered in the chapter. Many of the methods presented (including convolution, the Fourier transform, and Euler's formula) are fundamental and form the groundwork for other advanced data analysis methods. Readers who master the methods in the book will be well prepared to learn other approaches.
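As a taste of the style of analysis the book teaches, here is a minimal sketch of time-frequency analysis via complex Morlet wavelet convolution, which combines the fundamental methods the description names (convolution and Euler's formula). It is written in Python/NumPy as a stand-in for the book's Matlab, and the signal and parameters are invented for illustration:

```python
# Sketch of wavelet convolution for time-frequency analysis.
# Python/NumPy stand-in for the book's Matlab; all data are synthetic.
import numpy as np

srate = 1000                             # sampling rate (Hz)
t = np.arange(-1, 1, 1 / srate)          # wavelet time axis (s)
f = 10                                   # wavelet peak frequency (Hz)

# Complex Morlet wavelet: Euler's formula e^{i 2 pi f t} tapered by a Gaussian
n_cycles = 4
s = n_cycles / (2 * np.pi * f)           # Gaussian width
wavelet = np.exp(2j * np.pi * f * t) * np.exp(-t**2 / (2 * s**2))

# Toy "EEG": noise with a 10 Hz burst between 1 s and 2 s
sig_t = np.arange(0, 3, 1 / srate)
signal = np.random.randn(sig_t.size)
signal[1000:2000] += 2 * np.sin(2 * np.pi * f * sig_t[1000:2000])

# Convolution yields a complex signal whose magnitude tracks
# time-varying power at the wavelet's frequency
power = np.abs(np.convolve(signal, wavelet, mode='same')) ** 2
print(power[:1000].mean(), power[1000:2000].mean())  # burst power is larger
```

The magnitude of the complex convolution result estimates power over time at 10 Hz; this single move underlies the time-frequency analyses the book develops chapter by chapter.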
In Speaking, Willem "Pim" Levelt, Director of the Max-Planck-Institut für Psycholinguistik, accomplishes the formidable task of covering the entire process of speech production, from constraints on conversational appropriateness to articulation and self-monitoring of speech. Speaking is unique in its balanced coverage of all major aspects of the production of speech, in the completeness of its treatment of the entire speech process, and in its strategy of exemplifying rather than formalizing theoretical issues.
In this text, sociologist and art critic Kathryn Henderson offers a perspective on this topic by exploring the impact of computer graphic systems on the visual culture of engineering design. Henderson shows how designers use drawings to organize resources, political support, and power.
Bayes or Bust? provides the first balanced treatment of the complex set of issues involved in the confirmation of scientific hypotheses, a nagging conundrum in the philosophy of science.
Provides a formal theory of nonmarket failure, analyzing such problems as redundant costs, monopoly, frequency of unanticipated externalities, and bureaucracy in such nonmarket institutions as foundations, universities, and government.
How were huge stones moved from quarries to the sites of Egyptian pyramids? How did the cathedral builders of the Middle Ages lift blocks to great heights by muscle power alone? In this intriguing book John Fitchen explains and illustrates the solutions to these and many other puzzles in preindustrial building construction. This is the first general survey of the practices and role of the builder (as opposed to the designer) in constructing an array of structures. Fitchen's approach gives a valuable hands-on feel for what it's like to work with ropes and ladders, wedges and slings; with crews engaged in well digging, bridge building, and the transporting of obelisks hundreds of miles by water and over land. The buildings discussed range from the tents, tepees, and igloos of nomadic tribes to the monumental pyramids of Egypt, the temples of Greece, the aqueducts of Rome, and the cathedrals of medieval Europe.
According to Peter Ludlow, there is a very close relation between the structure of natural language and that of reality, and one can gain insights into long-standing metaphysical questions by studying the semantics of natural language. In this book Ludlow uses the metaphysics of time as a case study and focuses on the dispute between A-theorists and B-theorists about the nature of time. According to B-theorists, there is no genuine change, but a permanent sequence of events ordered by an earlier-than/later-than relation. According to the version of the A-theory adopted by Ludlow (a position sometimes called presentism), there are no past or future events or times; what makes something past or future is how the world stands right now. Ludlow argues that each metaphysical picture is tied to a particular semantical theory of tense and that the dispute can be adjudicated on semantical grounds. A presentism-compatible semantics, he claims, is superior to a B-theory semantics in a number of respects, including its abilities to handle the indexical nature of temporal discourse and to account for facts about language acquisition. Along the way, Ludlow develops a conception of E-type temporal anaphora that can account for both temporal anaphora and complex tenses without reference to past and future events. His view has philosophical consequences for theories of logic, self-knowledge, and memory. As for linguistic consequences, Ludlow suggests that the very idea of grammatical tense may have to be dispensed with and replaced with some combination of aspect, modality, and evidentiality.
Vinod Goel argues that the cognitive computational conception of the world requires our thought processes to be precise, rigid, discrete, and unambiguous; yet dense, ambiguous, and amorphous symbol systems, like sketching, painting, and poetry, found in the arts and in much of everyday discourse, have an important, nontrivial place in cognition.
This collection of the author's major papers from the last thirty years covers his scientific discoveries, his views on innovation and entrepreneurship, his reflections on his own field of chemical engineering, and his research on the global marketplace.