A unified, comprehensive, and up-to-date introduction to the analytical and numerical tools for solving dynamic economic problems.
How biases, the desire for a good narrative, reliance on citation metrics, and other problems undermine confidence in modern science.

Modern science is built on experimental evidence, yet scientists are often very selective in deciding what evidence to use and tend to disagree about how to interpret it. In The Matter of Facts, Gareth and Rhodri Leng explore how scientists produce and use evidence. They do so to contextualize an array of problems confronting modern science that have raised concerns about its reliability: the widespread use of inappropriate statistical tests, a shortage of replication studies, and a bias in both publishing and citing "positive" results. Before these problems can be addressed meaningfully, the authors argue, we must understand what makes science work and what leads it astray. The myth of science is that scientists constantly challenge their own thinking. But in reality, all scientists are in the business of persuading other scientists of the importance of their own ideas, and they do so by combining reason with rhetoric. Often, they look for evidence that will support their ideas, not for evidence that might contradict them; often, they present evidence in a way that makes it appear to be supportive; and often, they ignore inconvenient evidence.

In a series of essays focusing on controversies, disputes, and discoveries, the authors vividly portray science as a human activity, driven by passion as well as by reason. By analyzing the fluidity of scientific concepts and the dynamic and unpredictable development of scientific fields, the authors paint a picture of modern science and the pressures it faces.
The second edition of a comprehensive account of all the major aspects of the Japanese economy, substantially updated and expanded.

This textbook offers a comprehensive, rigorous but accessible account of all the major aspects of the Japanese economy, grounding its approach in mainstream economics. The second edition has been extensively revised and substantially updated, with new material that covers Japan's period of economic stagnation between 1991 and 2010. The first edition, published in 1992, focused on Japan as a success story of catch-up economic development; this edition reflects the lessons learned from Japan's Lost Two Decades.

After presenting the historical background, the book begins with macroeconomics, studying growth and business cycles. It then covers essential policy issues, with new material that takes into account the Japanese banking crisis of 1997-1998 and the global financial crisis of 2008-2009, discussing financial regulation, monetary policy, and fiscal policy. It goes on to examine saving, demography, and social security in light of Japan's ongoing demographic transition; industrial organization; labor markets; international trade and international finance; and the Japan-U.S. relationship. A new chapter offers a detailed analysis of the Lost Two Decades, synthesizing and applying concepts discussed in previous chapters and offering insights into such issues as successful catch-up growth, demographic shifts, and credit booms and busts.
How the tools of design research can involve designers more directly with the objects, products, and services they design; from human-centered research methods to formal experimentation, process models, and application to real-world design problems.

The tools of design research, writes Brenda Laurel, will allow designers "to claim and direct the power of their profession." Often neglected in the various curricula of design schools, the new models of design research described in this book help designers to investigate people, form, and process in ways that can make their work more potent and more delightful. "At the very least," Peter Lunenfeld writes in the preface, "design research saves us from reinventing the wheel. At its best, a lively research methodology can reinvigorate the passion that so often fades after designers join the profession." The goal of the book is to introduce designers to the many research tools that can be used to inform design as well as to ideas about how and when to deploy them effectively. The chapter authors come from diverse institutions and enterprises, including Stanford University, MIT, Intel, Maxis, Studio Anybody, Sweden's HUMlab, and Big Blue Dot. Each has something to say about how designers make themselves better at what they do through research, and illustrates it with real-world examples—case studies, anecdotes, and images. Topics of this multi-voice conversation include qualitative and quantitative methods, performance ethnography and design improvisation, trend research, cultural diversity, formal and structural research practice, tactical discussions of design research process, and case studies drawn from areas as unique as computer games, museum information systems, and movies. Interspersed throughout the book are one-page "demos," snapshots of the design research experience. Design Research charts the paths from research methods to research findings to design principles to design results and demonstrates the transformation of theory into a richly satisfying and more reliably successful practice.
This primer is designed to teach students the interconnected arts of visual communication. The subject is presented, not as a foreign language, but as a native one that the student "knows" but cannot yet "read."

Responding to the need she so clearly perceives, Ms. Dondis, a designer and teacher of broad experience, has provided a beginning text for art and design students and a basic text for all other students: those who do not intend to become artists or designers but who need to acquire the essential skills of understanding visual communication at a time when so much information is being studied and transmitted in non-verbal modes, especially through photography and film. Understanding through seeing only seems to be an obviously intuitive process. Actually, developing the visual sense is something like learning a language, with its own special alphabet, lexicon, and syntax. People find it necessary to be verbally literate whether they are "writers" or not; they should find it equally necessary to be visually literate, "artists" or not. The analogy of language provides a useful teaching method, in part because it is not overworked or too rigorously applied. This method of learning to see and read visual data has already been proved in practice, in settings ranging from Harlem to suburbia. Appropriately, the book makes some of its most telling points through visual means. Numerous illustrated examples are employed to clarify the basic elements of design (teach an alphabet), to show how they are used in simple syntactic combinations ("See Jane run."), and finally, to present the meaningful synthesis of visual information that is a finished work of art (the apprehension of poetry...).
A practitioner's guide to the basic principles of creating sound effects using easily accessed free software.

Designing Sound teaches students and professional sound designers to understand and create sound effects starting from nothing. Its thesis is that any sound can be generated from first principles, guided by analysis and synthesis. The text takes a practitioner's perspective, exploring the basic principles of making ordinary, everyday sounds using easily accessed free software. Readers use the Pure Data (Pd) language to construct sound objects, which are more flexible and useful than recordings. Sound is considered as a process, rather than as data—an approach sometimes known as "procedural audio." Procedural sound is a living sound effect that can run as computer code and be changed in real time according to unpredictable events. Applications include video games, film, animation, and media in which sound is part of an interactive process. The book takes a practical, systematic approach to the subject, teaching by example and providing background information that offers a firm theoretical context for its pragmatic stance. Many of the examples follow a pattern, beginning with a discussion of the nature and physics of a sound, proceeding through the development of models and the implementation of examples, to the final step of producing a Pure Data program for the desired sound. Different synthesis methods are discussed, analyzed, and refined throughout. After mastering the techniques presented in Designing Sound, students will be able to build their own sound objects for use in interactive applications and other projects.
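To give a concrete flavor of the procedural approach described above, here is a minimal sketch in Python rather than in Pure Data, the language the book actually uses; the bell-like partial frequencies and decay rates are illustrative assumptions, not an example taken from the book:

```python
# Procedural audio sketch: synthesize a bell-like sound from first principles
# as a sum of inharmonic partials, each with its own exponential decay.
# (Illustrative Python analogue; the book builds such objects in Pure Data.)
import numpy as np
import wave

SR = 44100                                   # sample rate in Hz
t = np.linspace(0, 2.0, 2 * SR, endpoint=False)

# (frequency Hz, amplitude, decay rate) for each partial -- assumed values
partials = [(220.0, 1.0, 1.5), (562.0, 0.6, 2.5), (921.0, 0.4, 4.0)]
signal = sum(a * np.sin(2 * np.pi * f * t) * np.exp(-d * t)
             for f, a, d in partials)
signal /= np.max(np.abs(signal))             # normalize to [-1, 1]

with wave.open("bell.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)                        # 16-bit PCM
    w.setframerate(SR)
    w.writeframes((signal * 32767).astype(np.int16).tobytes())
```

Because the sound is computed rather than played back from a recording, every parameter (pitch, decay, number of partials) can be changed at run time, which is the point of treating sound as a process rather than as data.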
How native people, from the Miwoks of Yosemite to the Maasai of eastern Africa, have been displaced from their lands in the name of conservation.
These photographs of grain elevators in America, Germany, Belgium, and France are a major addition to the Bechers' ongoing documentation of the vanishing buildings that once defined the industrial landscape of Europe and America.
Polemics and reflections on how to bridge the gap between what architecture actually is and what architects want it to be.
A comprehensive guide to the conceptual, mathematical, and implementational aspects of analyzing electrical brain signals, including data from MEG, EEG, and LFP recordings.

This book offers a comprehensive guide to the theory and practice of analyzing electrical brain signals. It explains the conceptual, mathematical, and implementational (via Matlab programming) aspects of time-, time-frequency-, and synchronization-based analyses of magnetoencephalography (MEG), electroencephalography (EEG), and local field potential (LFP) recordings from humans and nonhuman animals. It is the only book on the topic that covers both the theoretical background and the implementation in language that can be understood by readers without extensive formal training in mathematics, including cognitive scientists, neuroscientists, and psychologists.

Readers who go through the book chapter by chapter and implement the examples in Matlab will develop an understanding of why and how analyses are performed, how to interpret results, what the methodological issues are, and how to perform single-subject-level and group-level analyses. Researchers who are familiar with using automated programs to perform advanced analyses will learn what happens when they click the "analyze now" button.

The book provides sample data and downloadable Matlab code. Each of the 38 chapters covers one analysis topic, and these topics progress from simple to advanced. Most chapters conclude with exercises that further develop the material covered in the chapter. Many of the methods presented (including convolution, the Fourier transform, and Euler's formula) are fundamental and form the groundwork for other advanced data analysis methods. Readers who master the methods in the book will be well prepared to learn other approaches.
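As an illustration of how those fundamentals combine in practice, here is a minimal sketch, in Python rather than the book's Matlab, of one core technique the description names: convolving a signal with a complex Morlet wavelet via the Fourier transform to extract time-varying power at one frequency. The test signal and wavelet parameters are illustrative assumptions:

```python
# FFT-based convolution with a complex Morlet wavelet (Euler's formula
# supplies the complex carrier e^{i 2 pi f t}). Illustrative sketch only.
import numpy as np

srate = 1000                                  # sampling rate in Hz
t = np.arange(-1, 2, 1 / srate)               # 3 s of simulated "recording"
signal = np.sin(2 * np.pi * 10 * t) * (t > 0) * (t < 1)   # 10 Hz burst

freq, n_cycles = 10, 7                        # wavelet frequency and width
wt = np.arange(-1, 1, 1 / srate)
s = n_cycles / (2 * np.pi * freq)             # Gaussian width parameter
wavelet = np.exp(2j * np.pi * freq * wt) * np.exp(-wt**2 / (2 * s**2))

# Convolution theorem: multiply the spectra, then invert the transform.
n_conv = len(signal) + len(wavelet) - 1
spec = np.fft.fft(signal, n_conv) * np.fft.fft(wavelet, n_conv)
conv = np.fft.ifft(spec)[len(wavelet) // 2 : len(wavelet) // 2 + len(signal)]

power = np.abs(conv) ** 2                     # time course of 10 Hz power
print(t[power.argmax()])                      # peaks near 0.5 s, mid-burst
```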
A historical study of Chile's twin experiments with cybernetics and socialism, and what they tell us about the relationship of technology and politics.

In Cybernetic Revolutionaries, Eden Medina tells the history of two intersecting utopian visions, one political and one technological. The first was Chile's experiment with peaceful socialist change under Salvador Allende; the second was the simultaneous attempt to build a computer system that would manage Chile's economy. Neither vision was fully realized—Allende's government ended with a violent military coup; the system, known as Project Cybersyn, was never completely implemented—but they hold lessons for today about the relationship between technology and politics.

Drawing on extensive archival material and interviews, Medina examines the cybernetic system envisioned by the Chilean government—which was to feature holistic system design, decentralized management, human-computer interaction, a national telex network, near real-time control of the growing industrial sector, and modeling the behavior of dynamic systems. She also describes, and documents with photographs, the network's Star Trek-like operations room, which featured swivel chairs with armrest control panels, a wall of screens displaying data, and flashing red lights to indicate economic emergencies.

Studying Project Cybersyn today helps us understand not only the technological ambitions of a government in the midst of political change but also the limitations of the Chilean revolution. This history further shows how human attempts to combine the political and the technological with the goal of creating a more just society can open new technological, intellectual, and political possibilities. Technologies, Medina writes, are historical texts; when we read them we are reading history.
The life and work of a scientist who spent his career crossing disciplinary boundaries, from experimental neurology to psychiatry to cybernetics to engineering.
How do media find an audience when there is an endless supply of content but a limited supply of public attention?
The social, political, and cultural consequences of attempts to cheat death by freezing life.
Why small business is not the basis of American prosperity, not the foundation of American democracy, and not the champion of job creation.
Ancient history, midcentury modernism, Cinemascope, humanism and monumentality, totalitarianism and democracy: transformations in American culture and architecture.

In Flintstone Modernism, Jeffrey Lieber investigates transformations in postwar American architecture and culture. He considers sword-and-sandal films of the 1950s and 1960s—including forgotten gems such as Land of the Pharaohs, Helen of Troy, and The Egyptian—and their protean, ideologically charged representations of totalitarianism and democracy. He connects Cinemascope and other widescreen technologies to the architectural "glass curtain wall," arguing that both represented the all-encompassing eye of American Enterprise. Lieber reminds us that until recently midcentury modern American architecture was reviled by architectural historians but celebrated by design enthusiasts, just as sword-and-sandal epics are alternately hailed as cult classics or derided as camp.

Lieber's argument is absorbing, exuberant, and comprehensive. Following Hannah Arendt, who looked for analogies in the classical past in order to understand midcentury's cultural crisis, Lieber terms the postwar reckoning of ancient civilizations and modern ideals "Flintstone modernism."

In new assessments of the major architects of the period, Lieber uncovers the cultural and political fantasies that animated or impinged on their work, offering surprising insights into Gordon Bunshaft's commonsense classicism; Eero Saarinen's architectural narratives of ersatz empire and Marcel Breuer's mania for Egyptian monoliths; and Edward Durell Stone's romantic "flights of fancy" and Philip Johnson's wicked brand of cynical cultural and sociopolitical critique.

Deftly moving among architecture, film, philosophy, and politics, Lieber illuminates the artifice that resulted from the conjunction of high style and mass-cultural values in postwar America.
A comprehensive introduction to optimization with a focus on practical algorithms for the design of engineering systems.

This book offers a comprehensive introduction to optimization with a focus on practical algorithms. The book approaches optimization from an engineering perspective, where the objective is to design a system that optimizes a set of metrics subject to constraints. Readers will learn about computational approaches for a range of challenges, including searching high-dimensional spaces, handling problems where there are multiple competing objectives, and accommodating uncertainty in the metrics. Figures, examples, and exercises convey the intuition behind the mathematical approaches. The text provides concrete implementations in the Julia programming language. Topics covered include derivatives and their generalization to multiple dimensions; local descent and first- and second-order methods that inform local descent; stochastic methods, which introduce randomness into the optimization process; linear constrained optimization, when both the objective function and the constraints are linear; surrogate models, probabilistic surrogate models, and using probabilistic surrogate models to guide optimization; optimization under uncertainty; uncertainty propagation; expression optimization; and multidisciplinary design optimization. Appendixes offer an introduction to the Julia language, test functions for evaluating algorithm performance, and mathematical concepts used in the derivation and analysis of the optimization methods discussed in the text. The book can be used by advanced undergraduates and graduate students in mathematics, statistics, computer science, any engineering field (including electrical engineering and aerospace engineering), and operations research, and as a reference for professionals.
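As a taste of the material, here is a minimal sketch of first-order local descent, one of the methods listed above. The book's reference implementations are in Julia; this Python analogue and its test function are illustrative assumptions:

```python
# Gradient descent: repeatedly step against the gradient until it vanishes.
import numpy as np

def gradient_descent(grad, x0, alpha=0.1, tol=1e-8, max_iter=10_000):
    """Minimize a differentiable function given its gradient `grad`."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:        # stationary point reached
            break
        x = x - alpha * g                  # fixed step against the gradient
    return x

# Quadratic bowl f(x, y) = (x - 1)^2 + 4 y^2, minimized at (1, 0).
grad_f = lambda v: np.array([2 * (v[0] - 1), 8 * v[1]])
print(gradient_descent(grad_f, [5.0, 3.0]))   # approx [1. 0.]
```

The step size alpha here is fixed for simplicity; the first- and second-order methods the description mentions refine how the direction and step are chosen.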
An argument that—despite dramatic advances in the field—artificial intelligence is nowhere near developing systems that are genuinely intelligent.

In this provocative book, Brian Cantwell Smith argues that artificial intelligence is nowhere near developing systems that are genuinely intelligent. Second-wave AI, machine learning, even visions of third-wave AI: none will lead to human-level intelligence and judgment, which have been honed over millennia. Recent advances in AI may be of epochal significance, but human intelligence is of a different order than even the most powerful calculative ability enabled by new computational capacities. Smith calls this AI ability "reckoning," and argues that it does not lead to full human judgment—dispassionate, deliberative thought grounded in ethical commitment and responsible action.

Taking judgment as the ultimate goal of intelligence, Smith examines the history of AI from its first-wave origins ("good old-fashioned AI," or GOFAI) to such celebrated second-wave approaches as machine learning, paying particular attention to recent advances that have led to excitement, anxiety, and debate. He considers each AI technology's underlying assumptions, the conceptions of intelligence targeted at each stage, and the successes achieved so far. Smith unpacks the notion of intelligence itself—what sort humans have, and what sort AI aims at. Smith worries that, impressed by AI's reckoning prowess, we will shift our expectations of human intelligence. What we should do, he argues, is learn to use AI for the reckoning tasks at which it excels while we strengthen our commitment to judgment, ethics, and the world.
Explorations of the many ways of being material in the digital age.

In his oracular 1995 book Being Digital, Nicholas Negroponte predicted that social relations, media, and commerce would move from the realm of "atoms to bits"—that human affairs would be increasingly untethered from the material world. And yet in 2019, an age dominated by the digital, we have not quite left the material world behind. In Being Material, artists and technologists explore the relationship of the digital to the material, demonstrating that processes that seem wholly immaterial function within material constraints. Digital technologies themselves, they remind us, are material things—constituted by atoms of gold, silver, silicon, copper, tin, tungsten, and more. The contributors explore five modes of being material: programmable, wearable, livable, invisible, and audible. Their contributions take the form of reports, manifestos, philosophical essays, and artist portfolios, among other configurations. The book's cover merges the possibilities of paper with those of the digital, featuring a bookmark-like card that, when "seen" by a smartphone, generates graphic arrangements that unlock films, music, and other dynamic content on the book's website. At once artist's book, digitally activated object, and collection of scholarship, this book both demonstrates and chronicles the many ways of being material.

Contributors
Christina Agapakis, Azra Aksamija, Sandy Alexandre, Dewa Alit, George Barbastathis, Maya Beiser, Marie-Pier Boucher, Benjamin H. Bratton, Hussein Chalayan, Jim Cybulski, Tal Danino, Deborah G. Douglas, Arnold Dreyblatt, M. Amah Edoh, Michelle Tolini Finamore, Team Foldscope and Global Foldscope community, Ben Fry, Victor Gama, Stefan Helmreich, Hyphen-Labs, Leila Kinney, Rebecca Konte, Winona LaDuke, Brendan Landis, Grace Leslie, Bill Maurer, Lucy McRae, Tom Özden-Schilling, Trevor Paglen, Lisa Parks, Nadya Peek, Claire Pentecost, Manu Prakash, Casey Reas, Pawel Romanczuk, Natasha D. Schüll, Nick Shapiro, Skylar Tibbits, Rebecca Uchill, Evan Ziporyn

Book Design: E Roon Kang
Electronics, interactions, and product designer: Marcelo Coelho
A self-contained introduction to abstract interpretation-based static analysis, an essential resource for students, developers, and users.

Static program analysis, or static analysis, aims to discover semantic properties of programs without running them. It plays an important role in all phases of development, including verification of specifications and programs, the synthesis of optimized code, and the refactoring and maintenance of software applications. This book offers a self-contained introduction to static analysis, covering the basics of both theoretical foundations and practical considerations in the use of static analysis tools. By offering a quick and comprehensive introduction for nonspecialists, the book fills a notable gap in the literature, which until now has consisted largely of scientific articles on advanced topics.

The text covers the mathematical foundations of static analysis, including semantics, semantic abstraction, and computation of program invariants; more advanced notions and techniques, including techniques for enhancing the cost-accuracy balance of analysis and abstractions for advanced programming features and answering a wide range of semantic questions; and techniques for implementing and using static analysis tools. It begins with background information and an intuitive and informal introduction to the main static analysis principles and techniques. It then formalizes the scientific foundations of program analysis techniques, considers practical aspects of implementation, and presents more advanced applications. The book can be used as a textbook in advanced undergraduate and graduate courses in static analysis and program verification, and as a reference for users, developers, and experts.
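To make the central idea concrete, here is a minimal sketch of abstract interpretation over the classic interval domain, in Python; the analyzed expression and the tiny set of operations are illustrative assumptions (a real analyzer also handles loops, widening, and much more):

```python
# Interval abstract domain: run a computation on sets of values at once.
class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):              # [a,b] + [c,d] = [a+c, b+d]
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):              # min/max over the corner products
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

# Abstractly evaluate y = x * x + 1 for every x in [-2, 3], never running
# the program on any single input:
x = Interval(-2, 3)
y = x * x + Interval(1, 1)
print(y)   # [-5, 10]: sound (contains every concrete result) but imprecise,
           # since the true range is [1, 10] -- the abstraction trades
           # precision for the ability to cover all inputs at once.
```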
This advanced text introduces the principles of noncooperative game theory in a direct and uncomplicated style that will acquaint students with the broad spectrum of the field while highlighting and explaining what they need to know at any given point.

This advanced text introduces the principles of noncooperative game theory—including strategic form games, Nash equilibria, subgame perfection, repeated games, and games of incomplete information—in a direct and uncomplicated style that will acquaint students with the broad spectrum of the field while highlighting and explaining what they need to know at any given point. The analytic material is accompanied by many applications, examples, and exercises. The theory of noncooperative games studies the behavior of agents in any situation where each agent's optimal choice may depend on a forecast of the opponents' choices. "Noncooperative" refers to choices that are based on the participant's perceived self-interest. Although game theory has been applied to many fields, Fudenberg and Tirole focus on the kinds of game theory that have been most useful in the study of economic problems. They also include some applications to political science. The fourteen chapters are grouped in parts that cover static games of complete information, dynamic games of complete information, static games of incomplete information, dynamic games of incomplete information, and advanced topics.
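As a small worked example of one concept listed above, here is a sketch that finds the pure-strategy Nash equilibria of a strategic form game by checking mutual best responses; the payoff matrix (a prisoner's dilemma) is an illustrative assumption, not an exercise from the book:

```python
# Pure-strategy Nash equilibria by brute force over a bimatrix game.
from itertools import product

# payoffs[i][j] = (row player's payoff, column player's payoff);
# strategies: 0 = cooperate, 1 = defect (prisoner's dilemma).
payoffs = [[(3, 3), (0, 5)],
           [(5, 0), (1, 1)]]

def pure_nash(payoffs):
    rows, cols = len(payoffs), len(payoffs[0])
    equilibria = []
    for i, j in product(range(rows), range(cols)):
        row_best = all(payoffs[i][j][0] >= payoffs[k][j][0] for k in range(rows))
        col_best = all(payoffs[i][j][1] >= payoffs[i][k][1] for k in range(cols))
        if row_best and col_best:          # neither player gains by deviating
            equilibria.append((i, j))
    return equilibria

print(pure_nash(payoffs))                  # [(1, 1)]: mutual defection
```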
The Theory of Industrial Organization is the first primary text to treat the new industrial organization at the advanced-undergraduate and graduate level. Rigorously analytical and filled with exercises coded to indicate level of difficulty, it provides a unified and modern treatment of the field with accessible models that are simplified to highlight robust economic ideas while working at an intuitive level. To aid students at different levels, each chapter is divided into a main text and a supplementary section containing more advanced material. Each chapter opens with elementary models and builds on this base to incorporate current research in a coherent synthesis. Tirole begins with a background discussion of the theory of the firm.

In Part I he develops the modern theory of monopoly, addressing single-product and multiproduct pricing, static and intertemporal price discrimination, quality choice, reputation, and vertical restraints. In Part II, Tirole takes up strategic interaction between firms, starting with a novel treatment of the Bertrand-Cournot interdependent pricing problem. He studies how capacity constraints, repeated interaction, product positioning, advertising, and asymmetric information affect competition or tacit collusion. He then develops topics having to do with long-term competition, including barriers to entry, contestability, exit, and research and development. He concludes with a "game theory user's manual" and a section of review exercises.

Important Notice: The digital edition of this book is missing some of the images found in the physical edition.
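As a worked illustration of the Cournot interdependence mentioned above, here is the standard symmetric duopoly derivation; the linear demand, constant unit cost, and the assumption a > c are textbook simplifications, not a reproduction of Tirole's treatment:

```latex
% Cournot duopoly: inverse demand P = a - b(q_1 + q_2), constant unit cost c,
% with a > c. Firm i maximizes \pi_i = (a - b(q_1 + q_2) - c)\, q_i.
\[
\frac{\partial \pi_i}{\partial q_i} = a - c - 2b\,q_i - b\,q_j = 0
\quad\Longrightarrow\quad
q_i^{*} = \frac{a - c - b\,q_j}{2b} \quad\text{(best response to } q_j\text{)}.
\]
% Each firm's optimal output depends on its forecast of the rival's output;
% imposing symmetry q_1^{*} = q_2^{*} and solving gives
\[
q_i^{*} = \frac{a - c}{3b},
\qquad
P^{*} = a - b\,(q_1^{*} + q_2^{*}) = \frac{a + 2c}{3} > c .
\]
```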
Statistical approaches to processing natural language text have become dominant in recent years. This foundational text is the first comprehensive introduction to statistical natural language processing (NLP) to appear. The book contains all the theory and algorithms needed for building NLP tools. It provides broad but rigorous coverage of mathematical and linguistic foundations, as well as detailed discussion of statistical methods, allowing students and researchers to construct their own implementations. The book covers collocation finding, word sense disambiguation, probabilistic parsing, information retrieval, and other applications.
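To illustrate one of those applications, here is a minimal sketch of collocation finding scored by pointwise mutual information; the toy corpus and the frequency cutoff are illustrative assumptions (Manning and Schütze discuss this measure alongside several other tests):

```python
# Collocation finding with pointwise mutual information (PMI).
import math
from collections import Counter

corpus = ("new york is a big city . new york has new parks . "
          "the city is big .").split()

n = len(corpus)
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def pmi(w1, w2):
    """How much more often w1 w2 co-occur than independence predicts (log2)."""
    p_xy = bigrams[(w1, w2)] / (n - 1)
    p_x, p_y = unigrams[w1] / n, unigrams[w2] / n
    return math.log2(p_xy / (p_x * p_y))

# PMI is unstable for rare pairs, so keep only bigrams seen at least twice.
candidates = [b for b, c in bigrams.items() if c >= 2]
best = max(candidates, key=lambda b: pmi(*b))
print(best, round(pmi(*best), 2))          # ('new', 'york') 2.67
```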
The process of user-centered innovation: how it can benefit both users and manufacturers and how its emergence will bring changes in business models and in public policy.

Innovation is rapidly becoming democratized. Users, aided by improvements in computer and communications technology, increasingly can develop their own new products and services. These innovating users—both individuals and firms—often freely share their innovations with others, creating user-innovation communities and a rich intellectual commons. In Democratizing Innovation, Eric von Hippel looks closely at this emerging system of user-centered innovation. He explains why and when users find it profitable to develop new products and services for themselves, and why it often pays users to reveal their innovations freely for the use of all.

The trend toward democratized innovation can be seen in software and information products—most notably in the free and open-source software movement—but also in physical products. Von Hippel's many examples of user innovation in action range from surgical equipment to surfboards to software security features. He shows that product and service development is concentrated among "lead users," who are ahead on marketplace trends and whose innovations are often commercially attractive.

Von Hippel argues that manufacturers should redesign their innovation processes and that they should systematically seek out innovations developed by users. He points to businesses—the custom semiconductor industry is one example—that have learned to assist user-innovators by providing them with toolkits for developing new products. User innovation has a positive impact on social welfare, and von Hippel proposes that government policies, including R&D subsidies and tax credits, should be realigned to eliminate biases against it. The goal of a democratized user-centered innovation system, says von Hippel, is well worth striving for. An electronic version of this book is available under a Creative Commons license.
In Always Already New, Lisa Gitelman explores the newness of new media while she asks what it means to do media history. Using the examples of early recorded sound and digital networks, Gitelman challenges readers to think about the ways that media work as the simultaneous subjects and instruments of historical inquiry. Presenting original case studies of Edison's first phonographs and the Pentagon's first distributed digital network, the ARPANET, Gitelman points suggestively toward similarities that underlie the cultural definition of records (phonographic and not) at the end of the nineteenth century and the definition of documents (digital and not) at the end of the twentieth. As a result, Always Already New speaks to present concerns about the humanities as much as to the emergent field of new media studies. Records and documents are kernels of humanistic thought, after all—part of and party to the cultural impulse to preserve and interpret. Gitelman's argument suggests inventive contexts for "humanities computing" while also offering a new perspective on such traditional humanities disciplines as literary history. Making extensive use of archival sources, Gitelman describes the ways in which recorded sound and digitally networked text each emerged as local anomalies that were yet deeply embedded within the reigning logic of public life and public memory. In the end Gitelman turns to the World Wide Web and asks how the history of the Web is already being told, how the Web might also resist history, and how using the Web might be producing the conditions of its own historicity.