The first reference on rationality that integrates accounts from psychology and philosophy, covering descriptive and normative theories from both disciplines. Both analytic philosophy and cognitive psychology have made dramatic advances in understanding rationality, but there has been little interaction between the disciplines. This volume offers the first integrated overview of the state of the art in the psychology and philosophy of rationality. Written by leading experts from both disciplines, The Handbook of Rationality covers the main normative and descriptive theories of rationality—how people ought to think, how they actually think, and why we often deviate from what we can call rational. It also offers insights from other fields such as artificial intelligence, economics, the social sciences, and cognitive neuroscience. The Handbook proposes a novel classification system for researchers in human rationality, and it creates new connections between rationality research in philosophy, psychology, and other disciplines. Following the basic distinction between theoretical and practical rationality, the book first considers the theoretical side, including normative and descriptive theories of logical, probabilistic, causal, and defeasible reasoning. It then turns to the practical side, discussing topics such as decision making, bounded rationality, game theory, deontic and legal reasoning, and the relation between rationality and morality. Finally, it covers topics that arise in both theoretical and practical rationality, including visual and spatial thinking, scientific rationality, how children learn to reason rationally, and the connection between intelligence and rationality.
The trajectories of pollution in global capitalism, from the toxic waste of early tanneries to the poisonous effects of pesticides in the twentieth century. Through the centuries, the march of economic progress has been accompanied by the spread of industrial pollution. As our capacities for production and our aptitude for consumption have increased, so have their byproducts—chemical contamination from fertilizers and pesticides, diesel emissions, oil spills, a vast “plastic continent” found floating in the ocean. The Contamination of the Earth offers a social and political history of industrial pollution, mapping its trajectories over three centuries, from the toxic wastes of early tanneries to the fossil fuel energy regime of the twentieth century. The authors describe how, from 1750 onward, in contrast to the early modern period, polluted water and air came to be seen as inevitable side effects of industrialization, which was universally regarded as beneficial. By the nineteenth century, pollutants became constituent elements of modernity. The authors trace the evolution of these various pollutions, and describe the ways in which they were simultaneously denounced and permitted. The twentieth century saw new and massive scales of pollution: chemicals that resisted biodegradation, including napalm and other defoliants used as weapons of war; the ascendancy of oil; and a lifestyle defined by consumption. In the 1970s, pollution became a political issue, but efforts—local, national, and global—to regulate it often fell short. Viewing the history of pollution through a political lens, the authors also offer lessons for the future of the industrial world.
"The story of how legendary radio station WBCN (and by extension the city of Boston) emerged as a central crossroads of the 1960s counterculture and political activism"--
An accessible explanation of the technologies that enable such popular voice-interactive applications as Alexa, Siri, and Google Assistant. Have you talked to a machine lately? Asked Alexa to play a song, asked Siri to call a friend, asked Google Assistant to make a shopping list? This volume in the MIT Press Essential Knowledge series offers a nontechnical and accessible explanation of the technologies that enable these popular devices. Roberto Pieraccini, drawing on more than thirty years of experience at companies including Bell Labs, IBM, and Google, describes the developments in such fields as artificial intelligence, machine learning, speech recognition, and natural language understanding that allow us to outsource tasks to our ubiquitous virtual assistants. Pieraccini describes the software components that enable spoken communication between humans and computers, and explains why it's so difficult to build machines that understand humans. He explains speech recognition technology; problems in extracting meaning from utterances in order to execute a request; language and speech generation; the dialog manager module; and interactions with social assistants and robots. Finally, he considers the next big challenge in the development of virtual assistants: building in more intelligence--enabling them to do more than communicate in natural language and endowing them with the capacity to know us better, predict our needs more accurately, and perform complex tasks with ease.
"The authors examine the implications of AI for the future of life and work, and how this might change the structure and environment of high school education"--
The imagined histories of twenty-five architectural drawings and models, told through reminiscences, stories, conversations, letters, and monologues. Even when an architectural drawing does not show any human figures, we can imagine many different characters just off the page: architects, artists, onlookers, clients, builders, developers, philanthropists—working, observing, admiring, arguing. In Stories from Architecture, Philippa Lewis captures some of these personalities through reminiscences, anecdotes, conversations, letters, and monologues that collectively offer the imagined histories of twenty-five architectural drawings. Some of these untold stories are factual, like Frank Lloyd Wright’s correspondence with a Wisconsin librarian regarding her $5,000 dream home, or letters written by the English architect John Nash to his irascible aristocratic client. Others recount a fictional, if credible, scenario by placing these drawings—and with them their characters—into their immediate social context. For instance, the dilemmas facing a Regency couple who are considering a move to a suburban villa; a request from the office of Richard Neutra for an assistant to measure Josef von Sternberg’s Rolls-Royce so that the director’s beloved vehicle might fit into the garage being designed by his architect; a teenager dreaming of a life away from parental supervision by gazing at a gadget-filled bachelor pad in Playboy magazine; even a policeman recording the ground plans of the house of a murder scene. The drawings, reproduced in color, are all sourced from the Drawing Matter collection in Somerset, UK, and are fascinating objects in themselves; but Lewis shifts our attention beyond the image to other possible histories that linger, invisible, beyond the page, and in the process animates not just a series of archival documents but the writing of architectural history.
The story of the arcane table-top game that became a pop culture phenomenon and the long-running legal battle waged by its cocreators. When Dungeons & Dragons was first released to a small hobby community, it hardly seemed destined for mainstream success--and yet this arcane tabletop role-playing game became an unlikely pop culture phenomenon. In Game Wizards, Jon Peterson chronicles the rise of Dungeons & Dragons from hobbyist pastime to mass market sensation, from the initial collaboration to the later feud of its creators, Gary Gygax and Dave Arneson. As the game's fiftieth anniversary approaches, Peterson--a noted authority on role-playing games--explains how D&D and its creators navigated their successes, setbacks, and controversies. Peterson describes Gygax and Arneson's first meeting and their work toward the 1974 release of the game; the founding of TSR and its growth as a company; and Arneson's acrimonious departure and subsequent challenges to TSR. He recounts the "Satanic Panic" accusations that D&D was sacrilegious and dangerous, and how they made the game famous. And he chronicles TSR's reckless expansion and near-fatal corporate infighting, which culminated with the company in debt and overextended and the end of Gygax's losing battle to retain control over TSR and D&D. With Game Wizards, Peterson restores historical particulars long obscured by competing narratives spun by the one-time partners. That record amply demonstrates how the turbulent experience of creating something as momentous as Dungeons & Dragons can make people remember things a bit differently from the way they actually happened.
A novel, systematic theory of adjunct control, explaining how and why adjuncts shift between obligatory and nonobligatory control. Control in adjuncts involves a complex interaction of syntax, semantics, and pragmatics, which so far has resisted systematic analysis. In this book, Idan Landau offers the first comprehensive account of adjunct control. Extending the framework developed in his earlier book, A Two-Tiered Theory of Control, Landau analyzes ten different types of adjuncts and shows that they fall into two categories: those displaying strict obligatory control (OC) and those alternating between OC and nonobligatory control (NOC). He explains how and why adjuncts shift between OC and NOC, unifying their syntactic, semantic, and pragmatic properties. Landau shows that the split between the two types of adjuncts reflects a fundamental distinction in the semantic type of the adjunct: property (OC) or proposition (NOC), a distinction independently detectable by the adjunct's tolerance to a lexical subject. After presenting a fully compositional account of controlled adjuncts, Landau tests and confirms the specific configurational predictions for each type of adjunct. He describes the interplay between OC and NOC in terms of general principles of competition--both within the grammar and outside of it, in the pragmatics and in the processing module--shedding new light on classical puzzles in the acquisition of adjunct control by children. Along the way, he addresses a range of empirical phenomena, including implicit arguments, event control, logophoricity, and topicality.
A lavishly illustrated catalog of space technology of the future: lab-tested devices, experiments, and habitats for the age of participatory space exploration. As Earthlings, we stand on the brink of a new age: the Anthropocosmos—an era of space exploration in which we can expand humanity’s horizons beyond our planet’s bounds. And in this new era, we have twin responsibilities, to Earth and to space; we should neither abandon our own planet to environmental degradation nor litter the galaxy with space junk. This fascinating and generously illustrated volume—designed by MIT Media Lab researcher Sands Fish—presents space technology for this new age: prototypes, artifacts, experiments, and habitats for an era of participatory space exploration. These projects, developed as part of MIT’s Space Exploration Initiative, range from nanoscale imaging of microbes to responsive, sensor-mediated living environments. They show the usefulness of a seahorse tail for humans in microgravity, document the promise of shape-memory alloys for CubeSat in-orbit maneuvering, and introduce TESSERAE (Tessellated Electromagnetic Space Structures for the Exploration of Reconfigurable, Adaptive Environments), self-assembling space architecture. Some are ongoing, real-world systems: an art payload sent to the International Space Station via SpaceX CRS-20, for example, and a crowdsourced interplanetary cookbook. More than forty large-format, coffee table book–quality, full-color photographs make our future in space seem palpable. Short explanatory texts by Ariel Ekblaw, astronaut Cady Coleman, and others accompany the images.
From Go Fund Me to philanthropy: the everyday ways that we can give our money, our time, and even our data to help our communities and seek justice. In How We Give Now, Lucy Bernholz shows that philanthropy is more than writing a check and claiming a tax deduction. For most of us--the non-wealthy givers--philanthropy can be a way of living our values and fully participating in society. We give in all kinds of ways--shopping at certain businesses, canvassing for candidates, donating money, and making conscious choices with our retirement funds. We give our cash, our time, and even our data to make the world a better place. Bernholz takes readers on a tour of the often-overlooked worlds of participatory philanthropy, learning from a diverse group of forty resourceful givers. Donating our digitized personal data is an emerging form of philanthropy, and Bernholz describes safe, equitable, and effective ways of doing so--giving genetic data for medical research through a nonprofit genetics organization rather than a commercial one, for example, or contributing photographs to an online archive like the Densho Digital Repository, which documents America's internment of 120,000 Americans of Japanese descent. Bernholz tells us to "follow the money," however, when we're asked to "add a dollar" to our total at the cash register, or when we buy a charity-branded product; it's more effective to give directly than to give while shopping. Giving is a form of participation. Philanthropy by the rest of us--across geographies and cultural traditions--begins with and builds on active commitment to our communities.
An examination of machine learning art and its practice in new media art and music. Over the past decade, an artistic movement has emerged that draws on machine learning as both inspiration and medium. In this book, transdisciplinary artist-researcher Sofian Audry examines artistic practices at the intersection of machine learning and new media art, providing conceptual tools and historical perspectives for new media artists, musicians, composers, writers, curators, and theorists. Audry looks at works from a broad range of practices, including new media installation, robotic art, visual art, electronic music and sound, and electronic literature, connecting machine learning art to such earlier artistic practices as cybernetics art, artificial life art, and evolutionary art. Machine learning underlies computational systems that are biologically inspired, statistically driven, agent-based networked entities that program themselves. Audry explains the fundamental design of machine learning algorithmic structures in terms accessible to the nonspecialist while framing these technologies within larger historical and conceptual spaces. Audry debunks myths about machine learning art, including the ideas that machine learning can create art without artists and that machine learning will soon bring about superhuman intelligence and creativity. Audry considers learning procedures, describing how artists hijack the training process by playing with evaluative functions; discusses trainable machines and models, explaining how different types of machine learning systems enable different kinds of artistic practices; and reviews the role of data in machine learning art, showing how artists use data as a raw material to steer learning systems and arguing that machine learning allows for novel forms of algorithmic remixes.
Why women’s voices are outnumbered online and what we can do about it, by a New York Times comment moderator. If you’ve read the comments posted by readers of online news sites, you may have noticed the absence of women’s voices. Men are by far the most prolific commenters on politics and public affairs. When women do comment, they are often attacked or dismissed more than men are. In fact, the comment forums on news sites replicate conditions of the offline and social media worlds, where women are routinely interrupted, threatened, demeaned, and called wrong, unruly, disgusting, and out of place. In Digital Suffragists, Marie Tessier—a veteran journalist and a New York Times comment moderator for more than a decade—investigates why women’s voices are outnumbered online and what we can do about it. The suffragists of the early twentieth century were jailed for trying to vote. Can a twenty-first century democracy be functional when half of the population is not fully represented in a primary form of political communication? Tessier shows that for online comments, it’s a design problem: the linear blog comment formula was based on deeply gender-biased assumptions. Technologies designed with a broad range of end users in mind, she points out, are more successful and beneficial than those that reflect the designer’s own habits of mind. Tessier outlines benchmarks for a more democratic media, all of which stem from one fundamental idea: media must adopt gender and racial representation as key performance indicators. Equal speaking time for women is a measure of democracy.
How the internet disrupted the recorded music, newspaper, film, and television industries and what this tells us about surviving technological disruption. Much of what we think we know about how the internet "disrupted" media industries is wrong. Piracy did not wreck the recording industry, Netflix isn't killing Hollywood movies, and information does not want to be free. In Media Disrupted, Amanda Lotz looks at what really happened when the recorded music, newspaper, film, and television industries were the ground zero of digital disruption. It's not that digital technologies introduced "new media," Lotz explains; rather, they offered existing media new tools for reaching people. For example, the MP3 unbundled recorded music; as the internet enabled new ways for people to experience and pay for music, the primary source of revenue for the recorded music industry shifted from selling music to licensing it. Cable television providers, written off as predigital dinosaurs, became the dominant internet service providers. News organizations struggled to remake businesses in the face of steep declines in advertiser spending, while the film industry split its business among movies that compelled people to go to theaters and others that are better suited for streaming. Lotz looks in detail at how and why internet distribution disrupted each industry. The stories of business transformation she tells offer lessons for surviving and even thriving in the face of epoch-making technological change.
The first monograph on an important young American artist, generously illustrated with color images of his work. In his sculptures and installations, Matthew Angelo Harrison (b. 1989) engages with the legacies of racism and colonialism, parsing their contemporary connections to labor in the United States through an evolving visual language. With works that merge manufacturing technologies with the formal concerns of modernism and minimalism, the artist questions ideas of authorship and reproduction. Harrison's sculptures often include found objects--including traditional African figurines and auto industry ephemera--encased in resin blocks. Frozen and entombed, these sculptures appear as strangely haunted minimalist objects, both ancient and futuristic. This generously illustrated volume, published in conjunction with two major solo exhibitions, is the first monograph on an important young American artist. Another specter haunting Harrison's work is that of Detroit's defunct auto industry. A native of Detroit who once worked making prototypes in an auto manufacturing plant, Harrison sometimes employs precision machine-tooling techniques that are derived from those used by auto makers. In other works, Harrison replicates rare African masks and sculptures using hand-built, low-resolution 3D printing machines, rendering large-scale forms in wet clay--fragile, imperfect, and subject to glitches. In addition to color photos of Harrison’s work and images that illustrate the artist’s relationship to Detroit, the book features essays by curators and art historians Jessica Bell Brown and Elena Filipovic, as well as a conversation between Harrison and musician and theorist DeForrest Brown, Jr., led by curator Taylor Renee Aldridge. Contributors: Natalie Bell, Elena Filipovic, Jessica Bell Brown, Taylor Renee Aldridge, DeForrest Brown Jr., Matthew Angelo Harrison
How the treatment of sexual consent in erotic fanfiction functions as a form of cultural activism. Sexual consent is--at best--a contested topic in Western societies and cultures. The #MeToo movement has brought public attention to issues of sexual consent, revealing the endemic nature of sexual violence. Feminist academic approaches to sexual violence and consent are diverse and multidisciplinary--and yet consent itself is significantly undertheorized. In Dubcon, Milena Popova points to a community that has been considering issues of sex, power, and consent for many years: writers and readers of fanfiction. Their nuanced engagement with sexual consent, Popova argues, can shed light on these issues in ways not available to either academia or journalism. Popova explains that the term "dubcon" (short for "dubious consent") was coined by the fanfiction community to make visible the gray areas between rape and consent--for example, in situations where the distribution of power may limit an individual's ability to give meaningful consent to sex. Popova offers a close reading of three fanfiction stories in the Omegaverse genre, examines the "arranged marriage" trope, and discusses the fanfiction community's response when a sports star who was a leading character in RPF (real person fiction) was accused of rape. Proposing that fanfiction offers a powerful discursive resistance on issues of rape and consent that challenges dominant discourses about gender, romance, sexuality, and consent, Popova shows that fanfiction functions as a form of cultural activism.
A novel account of the evolution of language and the cognitive capacities on which language depends. In From Signal to Symbol, Ronald Planer and Kim Sterelny propose a novel theory of language: that modern language is the product of a long series of increasingly rich protolanguages evolving over the last two million years. Arguing that language and cognition coevolved, they give a central role to archaeological evidence and attempt to infer cognitive capacities on the basis of that evidence, which they link in turn to communicative capacities. Countering other accounts, which move directly from archaeological traces to language, Planer and Sterelny show that rudimentary forms of many of the elements on which language depends can be found in the great apes and were part of the equipment of the earliest species in our lineage. After outlining the constraints a theory of the evolution of language should satisfy and filling in the details of their model, they take up the evolution of words, composite utterances, and hierarchical structure. They consider the transition from a predominantly gestural to a predominantly vocal form of language and discuss the economic and social factors that led to language. Finally, they evaluate their theory in terms of the constraints previously laid out.
An expert on mind considers how animals and smart machines measure up to human intelligence. Octopuses can open jars to get food, and chimpanzees can plan for the future. An IBM computer named Watson won on Jeopardy! and Alexa knows our favorite songs. But do animals and smart machines really have intelligence comparable to that of humans? In Bots and Beasts, Paul Thagard looks at how computers ("bots") and animals measure up to the minds of people, offering the first systematic comparison of intelligence across machines, animals, and humans. Thagard explains that human intelligence is more than IQ and encompasses such features as problem solving, decision making, and creativity. He uses a checklist of twenty characteristics of human intelligence to evaluate the smartest machines--including Watson, AlphaZero, virtual assistants, and self-driving cars--and the most intelligent animals--including octopuses, dogs, dolphins, bees, and chimpanzees. Neither a romantic enthusiast for nonhuman intelligence nor a skeptical killjoy, Thagard offers a clear assessment. He discusses hotly debated issues about animal intelligence concerning bacterial consciousness, fish pain, and dog jealousy. He evaluates the plausibility of achieving human-level artificial intelligence and considers ethical and policy issues. A full appreciation of human minds reveals that current bots and beasts fall far short of human capabilities.
The inside story of the groundbreaking experiment that captured what people think about the life-and-death dilemmas posed by driverless cars. Human drivers don't find themselves facing such moral dilemmas as "should I sacrifice myself by driving off a cliff if that could save the life of a little girl on the road?" Human brains aren't fast enough to make that kind of calculation; the car is over the cliff in a nanosecond. A self-driving car, on the other hand, can compute fast enough to make such a decision--to do whatever humans have programmed it to do. But what should that be? This book investigates how people want driverless cars to decide matters of life and death. In The Car That Knew Too Much, psychologist Jean-François Bonnefon reports on a groundbreaking experiment that captured what people think cars should do in situations where not everyone can be saved. Sacrifice the passengers for pedestrians? Save children rather than adults? Kill one person so many can live? Bonnefon and his collaborators Iyad Rahwan and Azim Shariff designed the largest experiment in moral psychology ever: the Moral Machine, an interactive website that has allowed people--eventually, millions of them, from 233 countries and territories--to make choices within detailed accident scenarios. Bonnefon discusses the responses (reporting, among other things, that babies, children, and pregnant women were most likely to be saved), the media frenzy over news of the experiment, and scholarly responses to it. Boosters for driverless cars argue that they will be in fewer accidents than human-driven cars. It's up to humans to decide how many fatal accidents we will allow these cars to have.
An expert on computer privacy and security shows how we can build privacy into the design of systems from the start. We are tethered to our devices all day, every day, leaving data trails of our searches, posts, clicks, and communications. Meanwhile, governments and businesses collect our data and use it to monitor us without our knowledge. So we have resigned ourselves to the belief that privacy is hard--choosing to believe that websites do not share our information, for example, and declaring that we have nothing to hide anyway. In this informative and illuminating book, a computer privacy and security expert argues that privacy is not that hard if we build it into the design of systems from the start. Along the way, Jaap-Henk Hoepman debunks eight persistent myths surrounding computer privacy. The website that claims it doesn't collect personal data, for example; Hoepman explains that most data is personal, capturing location, preferences, and other information. You don't have anything to hide? There's nothing wrong with wanting to keep personal information--even if it's not incriminating or embarrassing--private. Hoepman shows that just as technology can be used to invade our privacy, it can be used to protect it, when we apply privacy by design. Hoepman suggests technical fixes, discussing pseudonyms, leaky design, encryption, metadata, and the benefits of keeping your data local (on your own device only), and outlines privacy design strategies that system designers can apply now.
How augmented reality and virtual reality are taking their places in contemporary media culture alongside film and television. This book positions augmented reality (AR) and virtual reality (VR) firmly in contemporary media culture. The authors view AR and VR not as the latest hyped technologies but as media—the latest in a series of what they term “reality media,” taking their places alongside film and television. Reality media inserts a layer of media between us and our perception of the world; AR and VR do not replace reality but refashion a reality for us. Each reality medium mediates and remediates; each offers a new representation that we implicitly compare to our experience of the world in itself but also through other media. The authors show that as forms of reality media emerge, they not only chart a future path for media culture, but also redefine media past. With AR and VR in mind, then, we can recognize their precursors in eighteenth-century panoramas and the Broadway lights of the 1930s. A digital version of Reality Media, available through the book’s website, invites readers to visit a series of virtual rooms featuring interactivity, 3-D models, videos, images, and texts that explore the themes of the book.
Provocative, hopeful essays imagine a future that is not reduced to algorithms. What is human flourishing in an age of machine intelligence, when many claim that the world’s most complex problems can be reduced to narrow technical questions? Does more computing make us more intelligent, or simply more computationally powerful? We need not always resist reduction; our ability to simplify helps us interpret complicated situations. The trick is to know when and how to do so. Against Reduction offers a collection of provocative and illuminating essays that consider different ways of recognizing and addressing the reduction in our approach to artificial intelligence, and ultimately to ourselves. Inspired by a widely read manifesto by Joi Ito that called for embracing the diversity and irreducibility of the world, these essays offer persuasive and compelling variations on resisting reduction. Among other things, the writers draw on indigenous epistemology to argue for an extended “circle of relationships” that includes the nonhuman and robotic; cast “Snow White” as a tale of AI featuring a smart mirror; point out the cisnormativity of security protocol algorithms; map the interconnecting networks of so-called noncommunicable disease; and consider the limits of moral mathematics. Taken together, they show that we should push back against some of the reduction around us and do whatever is in our power to work toward broader solutions.
How to empower people and communities with user-centric data ownership, transparent and accountable algorithms, and secure digital transaction systems. Data is now central to the economy, government, and health systems—so why are data and the AI systems that interpret the data in the hands of so few people? Building the New Economy calls for us to reinvent the ways that data and artificial intelligence are used in civic and government systems. Arguing that we need to think about data as a new type of capital, the authors show that the use of data trusts and distributed ledgers can empower people and communities with user-centric data ownership, transparent and accountable algorithms, machine learning fairness principles and methodologies, and secure digital transaction systems. It’s well known that social media generate disinformation and that mobile phone tracking apps threaten privacy. But these same technologies may also enable the creation of more agile systems in which power and decision-making are distributed among stakeholders rather than concentrated in a few hands. Offering both big ideas and detailed blueprints, the authors describe such key building blocks as data cooperatives, tokenized funding mechanisms, and tradecoin architecture. They also discuss technical issues, including how to build an ecosystem of trusted data, the implementation of digital currencies, and interoperability, and consider the evolution of computational law systems.
The first English translation of a nonfiction work by Stanislaw Lem, which was "conceived under the spell of cybernetics" in 1957 and updated in 1971. In 1957, Stanislaw Lem published Dialogues, a book "conceived under the spell of cybernetics," as he wrote in the preface to the second edition. Mimicking the form of Berkeley's Three Dialogues between Hylas and Philonous, Lem's original dialogue was an attempt to unravel the then-novel field of cybernetics. It was a testimony, Lem wrote later, to "the almost limitless cognitive optimism" he felt upon his discovery of cybernetics. This is the first English translation of Lem's Dialogues, including the text of the first edition and the later essays added to the second edition in 1971. For the second edition, Lem chose not to revise the original. Recognizing the naivete of his hopes for cybernetics, he constructed a supplement to the first dialogue, which consists of two critical essays, the first a summary of the evolution of cybernetics, the second a contribution to the cybernetic theory of the "sociopathology of governing," amending the first edition's discussion of the pathology of social regulation; and two previously published articles on related topics. From the vantage point of 1971, Lem observes that the original book, begun as a search for methods "that would increase our understanding of both the human and nonhuman worlds," was in the end "an expression of the cognitive curiosity and anxiety of modern thought."
"Written by Michael Jacobson, Ph.D., one of the most prominent advocates for sodium reduction since the 1970s, this book is a clarion call for radical change in America's relationship to salt"--
An accessible guide to cybersecurity for the everyday user, covering cryptography and public key infrastructure, malware, blockchain, and other topics. It seems that everything we touch is connected to the internet, from mobile phones and wearable technology to home appliances and cyber assistants. The more connected our computer systems, the more exposed they are to cyber attacks--attempts to steal data, corrupt software, disrupt operations, and even physically damage hardware and network infrastructures. In this volume of the MIT Press Essential Knowledge series, cybersecurity expert Duane Wilson offers an accessible guide to cybersecurity issues for everyday users, describing risks associated with internet use, modern methods of defense against cyber attacks, and general principles for safer internet use. Wilson describes the principles that underlie all cybersecurity defense: confidentiality, integrity, availability, authentication, authorization, and non-repudiation (validating the source of information). He explains that confidentiality is accomplished by cryptography; examines the different layers of defense; analyzes cyber risks, threats, and vulnerabilities; and breaks down the cyber kill chain and the many forms of malware. He reviews some online applications of cybersecurity, including end-to-end security protection, secure ecommerce transactions, smart devices with built-in protections, and blockchain technology. Finally, Wilson considers the future of cybersecurity, discussing the continuing evolution of cyber defenses as well as research that may alter the overall threat landscape.
An accessible introduction to a concept often considered impossibly abstruse, demonstrating its power as a conceptual tool in the twenty-first century. This volume in the MIT Press Essential Knowledge series offers a clear and concise introduction to a topic often considered difficult and abstruse: deconstruction. David Gunkel sorts out the concept, terminology, and practices of deconstruction, not to defend academic orthodoxy, or to disseminate the thought of Jacques Derrida--the fabricator of the neologism and progenitor of the concept--but to provide readers with a powerful conceptual tool for the twenty-first century. Gunkel explains that deconstruction is not simply the opposite of construction--the "deconstructed" jacket hanging in your closet is not, strictly speaking, accurately named--or synonymous with destruction. It is a way to think beyond the construction/destruction dichotomy and all other conceptual dichotomies and logical oppositions. After describing what deconstruction is not, and developing an abstract and schematic characterization derived from Derrida, Gunkel offers examples in (rather than of) deconstruction, including logocentrism (the speech/writing dichotomy) and virtuality (the ruling philosophical binary of real/appearance), remix (the original/copy distinction), and the posthuman figure of the cyborg (the human/machine conceptual pairing). Finally, Gunkel discusses the costs and benefits of deconstruction, considering the many things deconstruction is good for and identifying potential problems, including Eurocentrism, relativism, difficulties in communicating the concept, and reappropriation.
Combining handbook, dictionary, and anthology, investigations and examples of artistic practices aimed at social change. This volume from BAK, basis voor actuele kunst, combines handbook, dictionary, and anthology to investigate artistic practice aimed at achieving social change. With text and visual essays, definitions, exercises, interviews, and images, the contributors envision a praxis that is committed to experimenting with aesthetics and politics in ways that go beyond the conventions of Western modernity. These are practices that are interdisciplinary, theoretically informed, and politically driven, offering ways of "being together otherwise." Catalyzed by the work of artist Jeanne van Heeswijk, which focuses on radicalizing civic processes, Toward the Not-Yet imagines and enacts alternative ways of conceiving the present and future. Contributors, among them notable artists, scholars, activists, and writers, consider ways of participating in civic life, including "dreamscaping" and "radical listening"; the creation of safer spaces for humans and nonhumans; ways of radically shifting laws and policies; and tactics and methods of collective sanctuary. Toward the Not-Yet is part of BAK's series of BASICS readers, debuting a SUPERBASICS variation that is larger, with more visual content. Copublished with BAK, basis voor actuele kunst
An examination of the contemporary medicalization of death and dying that calls us to acknowledge instead death's existential and emotional realities. Death is a natural, inevitable, and deeply human process, and yet Western medicine tends to view it as a medical failure. In their zeal to prevent death, physicians and hospitals often set patients and their families on a seemingly unstoppable trajectory toward medical interventions that may actually increase suffering at the end of life. This volume in the MIT Press Essential Knowledge series examines the medicalization of death and dying and proposes a different approach--one that acknowledges death's existential and emotional realities. The authors--one an academic who teaches and studies end-of-life care, and the other a physician trained in hospice and palliative care--offer an account of Western-style death and dying that is informed by both research and personal experience. They examine the medical profession's attitude toward death as a biological dysfunction that needs fixing; describe the hospice movement, as well as movements for palliative care and aid in dying, and why they failed to influence mainstream medicine; consider our reluctance to have end-of-life conversations; and investigate the commodification of medicine and the business of dying. To help patients die in accordance with their values, they say, those who care for the dying should focus less on delaying death by any means possible and more on being present with the dying on their journey.