A critical edition of a classic work by the renowned philosopher George Santayana evaluating key movements in American intellectual history.

Winds of Doctrine presents six essays by the internationally recognized critic and philosopher George Santayana. The essays, edited by David E. Spiech, Martin A. Coleman, and Faedra Lazar Weiss, and introduced by Paul Forster, address the broad sweep of intellectual trends—or, as the title suggests, the ever-changing winds of thought—of the Spanish-born American thinker’s time. The topics range from the secularization of American culture to the rise of religious modernism to the “genteel tradition” in American philosophy, the subject of Santayana’s final lecture in America and perhaps his best-known essay. The original Winds of Doctrine, published in 1913, was the first book Santayana published after his 1912 departure for Europe. Santayana had felt stifled at Harvard for some time, and his long-contemplated resignation from academia released him from previous obligations and allowed him a new freedom to think and write. Much later, Santayana remarked on the significance of that choice to step away: “In Winds of Doctrine and my subsequent books, a reader of my earlier writings may notice a certain change of climate. . . . It was not my technical philosophy that was principally affected, but rather the meaning and status of philosophy for my inner man.”

An insightful document of American intellectual history, supplemented with annotations and rich textual commentary, Winds of Doctrine is a vital and engaging survey of the religious, political, philosophical, and literary trends of the twentieth century.
A clear, comprehensive, and rigorous introduction to the theory of computation.

What is computable? What leads to efficiency in computation? Computability and Complexity offers a clear, comprehensive, and rigorous introduction to the mathematical study of the capabilities and limitations of computation. Hubie Chen covers the core notions, techniques, methods, and questions of the theory of computation before turning to several advanced topics. Emphasizing intuitive learning and conceptual discussion, this textbook’s accessible approach offers a robust foundation for understanding both the reach and restrictions of algorithms and computers.

- Extensive exercises and diagrams enhance the streamlined, student-friendly presentation of mathematically rigorous material
- Includes thorough treatment of automata theory, computability theory, and complexity theory—including the P versus NP question and the theory of NP-completeness
- Suitable for undergraduate and graduate students, researchers, and professionals
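As a taste of the automata theory such a course covers, here is a minimal sketch (illustrative only, not drawn from the book) of a deterministic finite automaton, one of the basic machine models. This DFA accepts binary strings containing an even number of 1s.

```python
# Illustrative sketch: a deterministic finite automaton (DFA) that
# accepts binary strings with an even number of 1s. The transition
# table maps (state, symbol) pairs to successor states.
def run_dfa(transitions, start, accepting, s):
    state = start
    for ch in s:
        state = transitions[(state, ch)]
    return state in accepting

even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

print(run_dfa(even_ones, "even", {"even"}, "1001"))  # → True  (two 1s)
print(run_dfa(even_ones, "even", {"even"}, "1101"))  # → False (three 1s)
```

The same table-driven loop generalizes to any DFA; questions about what such machines can and cannot recognize are exactly where automata theory begins.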
From a former Chief Economist of the World Bank, a brief, balanced, and sobering discussion of globalization trends, their drivers, and effects on inequality.

The recent retreat from globalization has been triggered by a perception that increased competition from global trade is not fair and leads to increased inequality within countries. Is this phenomenon a small hiccup in the overall wave of globalization, or are we at the beginning of a new era of deglobalization? Former Chief Economist of the World Bank Group Pinelopi Koujianou Goldberg tells us that the answer depends on the policy choices we make, and in this book, The Unequal Effects of Globalization, she calls for exploring alternative policy approaches, including place-based policies, while sustaining international cooperation.

At this critical moment of shifting attitudes toward globalization, The Unequal Effects of Globalization enters the debate while also taking a step back. Goldberg investigates globalization’s many dimensions, disruptions, and complex interactions, from the late twentieth century’s wave of trade liberalizations to the rise of China, the decline of manufacturing in advanced economies, and the recent effects of trade on global poverty, inequality, labor markets, and firm dynamics. From there, Goldberg explores the significance of the recent backlash against and potential retreat from globalization, and considers the key policy implications of these trends and emerging dynamics.

As comprehensive as it is well-balanced, The Unequal Effects of Globalization is an essential read on trade and cooperation between nations that will appeal as much to academics and policymakers as it will to general readers who are interested in learning more about this timely subject.
A creative and comprehensive exploration of the institutional forces undermining the management of environments critical to public health.

For almost two decades, the citizens of Western Mexico have called for a cleanup of the Santiago River, a water source so polluted it emanates an overwhelming acidic stench. Toxic clouds of foam lift off the river in a strong wind. In Sewer of Progress, Cindy McCulligh examines why industrial dumping continues in the Santiago despite the corporate embrace of social responsibility and regulatory frameworks intended to mitigate environmental damage. The fault, she finds, lies in a disingenuous discourse of progress and development that privileges capitalist growth over the health and well-being of ecosystems. Rooted in research on institutional behavior and corporate business practices, Sewer of Progress exposes a type of regulatory greenwashing that allows authorities to deflect accusations of environmental dumping while “regulated” dumping continues in an environment of legal certainty. For transnational corporations, this type of simulation allows companies to take advantage of double standards in environmental regulations, while presenting themselves as socially responsible and green global actors. Through this inversion, the Santiago and other rivers in Mexico have become sewers for urban and industrial waste. Institutionalized corruption, a concept McCulligh introduces in the book, is the main culprit, a system that permits and normalizes environmental degradation, specifically in the creation and enforcement of a regulatory framework for wastewater discharge that prioritizes private interests over the common good.

Through a research paradigm based in institutional ethnography and political ecology, Sewer of Progress provides a critical, in-depth look at the power relations subverting the role of the state in environmental regulation and the maintenance of public health.
An innovative, wide-ranging consideration of the global ecological crisis and its deep philosophical and theological roots.

Global crises, from melting Arctic ice to ecosystem collapse and the sixth mass extinction, challenge our age-old belief in nature as a phoenix with an infinite ability to regenerate itself from the ashes of destruction. Moving from antiquity to the present and back, Michael Marder provides an integrated examination of philosophies of nature drawn from traditions around the world to illuminate the theological, mythical, and philosophical origins of the contemporary environmental emergency. From there, he probes the contradictions and deadlocks of our current predicament to propose a philosophy of nature for the twenty-first century.

As Marder analyzes our reliance on the image and idea of the phoenix to organize our thoughts about the natural world, he outlines the obstacles in the path of formulating a revitalized philosophy of nature. His critical exposition of the phoenix complex draws on Chinese, Indian, Russian, European, and North African traditions. Throughout, Marder lets the figure of the phoenix guide readers through theories of immortality, intergenerational and interspecies relations, infinity compatible with finitude, resurrection, reincarnation, and a possibility of liberation from cycles of rebirth. His concluding remarks on a phoenix-suffused philosophy of nature and political thought extend from the Roman era to the writings of Hannah Arendt.
An insightful investigation into the mechanisms underlying the predictive functions of neural networks—and their ability to chart a new path for AI.

Prediction is a cognitive advantage like few others, inherently linked to our ability to survive and thrive. Our brains are awash in signals that embody prediction. Can we extend this capability more explicitly into synthetic neural networks to improve the function of AI and enhance its place in our world? Gradient Expectations is a bold effort by Keith L. Downing to map the origins and anatomy of natural and artificial neural networks to explore how, when designed as predictive modules, their components might serve as the basis for the simulated evolution of advanced neural network systems.

Downing delves into the known neural architecture of the mammalian brain to illuminate the structure of predictive networks and determine more precisely how the ability to predict might have evolved from more primitive neural circuits. He then surveys past and present computational neural models that leverage predictive mechanisms with biological plausibility, identifying elements, such as gradients, that natural and artificial networks share. Behind well-founded predictions lie gradients, Downing finds, but of a different scope than those that belong to today’s deep learning. Digging into the connections between predictions and gradients, and their manifestation in the brain and neural networks, is one compelling example of how Downing enriches both our understanding of such relationships and their role in strengthening AI tools. Synthesizing critical research in neuroscience, cognitive science, and connectionism, Gradient Expectations offers unique depth and breadth of perspective on predictive neural-network models, including a grasp of predictive neural circuits that enables the integration of computational models of prediction with evolutionary algorithms.
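As a loose illustration of the link between gradients and prediction described above (a hypothetical sketch, not code from the book), the following trains a one-weight linear predictor by gradient descent to anticipate the next value of a signal.

```python
# Illustrative sketch: a single-weight predictor trained by gradient
# descent to forecast the next value of a time series. Each update
# follows the gradient of the squared prediction error.
def train_predictor(signal, lr=0.01, epochs=200):
    w = 0.0
    for _ in range(epochs):
        for t in range(len(signal) - 1):
            pred = w * signal[t]
            err = pred - signal[t + 1]
            w -= lr * err * signal[t]  # gradient of squared error w.r.t. w
    return w

# A signal where each value is double the last; the optimal weight is 2.
signal = [1.0, 2.0, 4.0, 8.0]
print(round(train_predictor(signal), 2))  # → 2.0
```

Because the data are perfectly consistent, the squared-error minimum sits exactly at w = 2, and the gradient steps converge there.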
A framework for knowledge ownership that challenges the mechanisms of inequality in modern society.

Scholars of science, technology, medicine, and law have all tended to emphasize knowledge as the sum of human understanding, and its ownership as possession by law. Breaking with traditional discourse on knowledge property as something that concerns mainly words and intellectual history, or science and law, Dagmar Schäfer, Annapurna Mamidipudi, and Marius Buning propose technology as a central heuristic for studying the many implications of knowledge ownership. Toward this end, they focus on the notions of knowledge and ownership in courtrooms, workshops, policy, and research practices, while also shedding light on scholarship itself as a powerful tool for making explicit the politics inherent in knowledge practices and social order. The book presents case studies showing how diverse knowledge economies are created and how inequalities arise from them. Unlike scholars who have fragmented this discourse across the disciplines of anthropology, sociology, and history, the editors highlight recent developments in the emerging field of the global history of knowledge—as science, as economy, and as culture. The case studies reveal how notions of knowing and owning emerge because they reciprocally produce and determine each other’s limits and possibilities; that is, how we know inevitably affects how we can own what we know; and how we own always impacts how and what we are able to know.

Contributors (Listed in Order of Appearance): Dagmar Schäfer, Annapurna Mamidipudi, Cynthia Brokaw, Marius Buning, Viren Murthy, Marjolijn Bol, Amy E. Slaton, James Leach, Myles W. Jackson, Lissant Bolton, Vivek S. Oak, Jörn Oeder
A new way to teach media studies that centers students’ lived experiences and diverse perspectives from around the world.

From the intimate to the mundane, most aspects of our lives—how we learn, love, work, and play—take place in media. Taking an expansive, global perspective, this introductory textbook covers what it means to live in, rather than with, media. Mark Deuze focuses on the lived experience—how people who use smartphones, the internet, and television sets make sense of their digital environment—to investigate the broader role of media in society and everyday life. Life in Media uses relatable examples and case studies from around the world to illustrate the foundational theories, concepts, and methods of media studies. The book is structured around seven core themes: how media inform and inspire our daily activities; how we live our lives in the public eye; how we make distinctions between real and fake; how we seek and express love; how we use media to effect change; how we create media and shared narratives; and how we seek to create well-being within media. By deliberately including diverse voices and radically embracing the everyday and mundane aspects of media life, this book innovates ways to teach and talk about media.

- Highlights diverse international voices, images, and cases
- Uses accessible examples from everyday life to contextualize theory
- Offers a comprehensive, student-centered introduction to media studies
- Extensively annotated bibliography offers dynamic sources for further study, including readings and documentary films
How marketers learned to dream of optimization and speak in the idiom of management science well before the widespread use of the Internet.

Algorithms, data extraction, digital marketers monetizing "eyeballs": these all seem like such recent features of our lives. And yet, Lee McGuigan tells us in this eye-opening book, digital advertising was well underway before the widespread use of the Internet. Explaining how marketers have brandished the tools of automation and management science to exploit new profit opportunities, Selling the American People traces data-driven surveillance all the way back to the 1950s, when the computerization of the advertising business began to blend science, technology, and calculative cultures in an ideology of optimization. With that ideology came adtech, a major infrastructure of digital capitalism.

To help make sense of today's attention merchants and choice architects, McGuigan explores a few key questions: How did technical experts working at the intersection of data processing and management sciences come to command the center of gravity in the advertising and media industries? How did their ambition to remake marketing through mathematical optimization shape and reflect developments in digital technology? In short, where did adtech come from, and how did data-driven marketing come to mediate the daily encounters of people, products, and public spheres? His answers show how the advertising industry's efforts to bend information technologies toward its dream of efficiency and rational management helped to make "surveillance capitalism" one of the defining experiences of public life.
A compelling and innovative account that reshapes our view of nineteenth-century chemistry, explaining a critical period in chemistry’s quest to understand and manipulate organic nature.

According to existing histories, theory drove chemistry’s remarkable nineteenth-century development. In Molecular World, Catherine M. Jackson shows instead how novel experimental approaches combined with what she calls “laboratory reasoning” enabled chemists to bridge wet chemistry and abstract concepts and, in so doing, create the molecular world. Jackson introduces a series of practice-based breakthroughs that include chemistry’s move into lampworked glassware, the field’s turn to synthesis and subsequent struggles to characterize and differentiate the products of synthesis, and the gradual development of institutional chemical laboratories, an advance accelerated by synthesis and the dangers it introduced.

Jackson’s historical reassessment emerges from the investigation of alkaloids by German chemists Justus Liebig, August Wilhelm Hofmann, and Albert Ladenburg. Stymied in his own research, Liebig steered his student Hofmann into pioneering synthesis as a new investigative method. Hofmann’s practice-based laboratory reasoning produced a major theoretical advance, but he failed to make alkaloids. That landmark fell to Ladenburg, who turned to cutting-edge theory only after his successful synthesis.

In telling the story of these scientists and their peers, Jackson reveals organic synthesis as the ground chemists stood upon to forge a new relationship between experiment and theory—with far-reaching consequences for chemistry as a discipline.
A history of urban travel demand modeling (UTDM) and its enormous influence on American life from the 1920s to the present.

For better and worse, the automobile has been an integral part of the American way of life for decades. Its ascendance would have been far less spectacular, however, had engineers and planners not devised urban travel demand modeling (UTDM). This book tells the story of this irreplaceable engineering tool that has helped cities to accommodate the continuous rise in traffic from the 1950s on. Beginning with UTDM’s origins as a method to help plan new infrastructure, Konstantinos Chatzis follows its trajectory through new generations of models that helped make optimal use of existing capacity and examines related policy instruments, including the recent use of intelligent transportation systems.

Chatzis investigates these models as evolving entities involving humans and nonhumans that were shaped through a specific production process. In surveying the various generations of UTDM, he delves into various means of production (from tabulating machines to software packages) and travel survey methods (from personal interviews to GPS tracking devices and smartphones) used to obtain critical information. He also looks at the individuals who have collectively built a distinct UTDM social world by displaying specialized knowledge, developing specific skills, and performing various tasks and functions, and by communicating, interacting, and even competing with one another.

Original and refreshingly accessible, Forecasting Travel in Urban America offers the first detailed history behind the thinkers and processes that impact the lives of millions of city dwellers every day.
The first reader in critical plant studies, exploring a rapidly growing multidisciplinary field—the intersection of philosophy with plant science and the visual arts.

In recent years, philosophy and art have testified to how anthropocentrism has culturally impoverished our world, leading to the wide destruction of habitats and ecosystems. In this book, Giovanni Aloi and Michael Marder show that the field of critical plant studies can make an important contribution, offering a slew of possibilities for scientific research, local traditions, Indigenous knowledge, history, geography, anthropology, philosophy, and aesthetics to intersect, inform one another, and lead interdisciplinary and transcultural dialogues. Vegetal Entwinements in Philosophy and Art considers such topics as the presence of plants in the history of philosophy, the shifting status of plants in various traditions, what it means to make art with growing life-forms, and whether or not plants have moral standing. In an experimental vegetal arrangement, the reader presents some of the most influential writing on plants, philosophy, and the arts, together with provocative new contributions, as well as interviews with groundbreaking contemporary artists whose work has greatly enhanced our appreciation of vegetal being.

Contributors: Catriona A.H. Sandilands, Giovanni Aloi, Marlene Atleo, Monica Bakke, Emily Blackmer, Jodi Brandt, Teresa Castro, Dan Choffness, D. Denenge Duyst-Akpem, Mark Dion, Elisabeth E. Schussler, Braden Elliott, Monica Gagliano, Elaine Gan, Prudence Gibson, James H. Wandersee, Manuela Infante, Luce Irigaray, Nicholas J. Reo, Jonathon Keats, Zayaan Khan, Robin Wall Kimmerer, Eduardo Kohn, Stefano Mancuso, Michael Marder, Anguezomo Mba Bikoro, Elaine Miller, Samaneh Moafi, Uriel Orlow, Mark Payne, Allegra Pesenti, Špela Petrič, Michael Pollan, Darren Ranco, Angela Roothaan, Marcela Salinas, Diana Scherer, Vandana Shiva, Linda Tegg, Maria Theresa Alves, Krista Tippet, Anthony Trewavas, Alessandra Viola, Eduardo Viveiros de Castro, B+W, Mathai Wangari, Lois Weinberger, Kyle Whyte, David Wood, Anicka Yi
An innovative historical analysis of the intersection of religion and technology in making the modern state, focusing on bodily production and reproduction across the human-animal divide.

In Milk and Honey, Tamar Novick writes a revolutionary environmental history of the state that centers on the intersection of technology and religion in modern Israel/Palestine. Focusing on animals and the management of their production and reproduction across three political regimes—late Ottoman rule, British rule, and the early Israeli state—Novick draws attention to the ways in which settlers and state experts used agricultural technology to recreate a biblical idea of past plenitude, literally a “land flowing with milk and honey,” through the bodies of animals and people. Novick presents a series of case studies involving the management of water buffalo, bees, goats, sheep, cows, and people in Palestine/Israel. She traces the intimate forms of knowledge and bodily labor—production and reproduction—in which this process took place, and the intertwining of bodily, political, and environmental realms in the transformation of Palestine/Israel. Her wide-ranging approach shows that technology never replaced religion as a colonial device. Rather, it merged with settler-colonial aspirations to salvage the land, bolstering the effort to seize control over territory and people.

Fusing technology, religious fervor, bodily labor, and political ecology, Milk and Honey provides a novel account of the practices that defined and continue to shape settler-colonialism in Palestine/Israel, revealing the ongoing entanglement of technoscience and religion in our time.
This book presents empirical methods for studying complex computer programs: exploratory tools to help find patterns in data, experiment designs and hypothesis-testing tools to help data speak convincingly, and modeling tools to help explain data.
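As one hedged illustration of the hypothesis-testing tools mentioned above (a sketch with invented data, not an example from the book), the following compares runtimes of two program variants using Welch's t statistic, computed from scratch.

```python
# Illustrative sketch: Welch's t statistic for comparing the mean
# runtimes of two program variants from independent samples.
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical runtimes (seconds) for two versions of a program.
old = [1.21, 1.19, 1.25, 1.22, 1.30]
new = [1.02, 0.99, 1.05, 1.01, 1.08]
print(f"t = {welch_t(old, new):.2f}")  # large |t|: difference unlikely to be noise
```

A full analysis would also compute degrees of freedom and a p-value, but even this bare statistic shows how such tools help performance data "speak convincingly."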
A theorization of habit that emphasizes its excessive, unsettling, and disturbing material and temporal qualities rather than its mediating and stabilizing functions.
Essays on evolvability from the perspectives of quantitative and population genetics, evolutionary developmental biology, systems biology, macroevolution, and the philosophy of science.

Evolvability—the capability of organisms to evolve—wasn’t recognized as a fundamental concept in evolutionary theory until 1990. Though there is still some debate as to whether it represents a truly new concept, the essays in this volume emphasize its value in enabling new research programs and facilitating communication among the major disciplines in evolutionary biology. The contributors, many of whom were instrumental in the development of the concept of evolvability, synthesize what we have learned about it over the past thirty years. They focus on the historical and philosophical contexts that influenced the emergence of the concept and suggest ways to develop a common language and theory to drive further evolvability research. The essays, drawn from a workshop on evolvability hosted in 2019–2020 by the Center of Advanced Study at the Norwegian Academy of Science and Letters, in Oslo, provide scientific and historical background on evolvability. The contributors represent different disciplines of evolutionary biology, including quantitative and population genetics, evolutionary developmental biology, systems biology and macroevolution, as well as the philosophy of science. This plurality of approaches allows researchers in disciplines as diverse as developmental biology, molecular biology, and systems biology to communicate with those working in mainstream evolutionary biology. The contributors also discuss key questions at the forefront of research on evolvability.

Contributors: J. David Aponte, W. Scott Armbruster, Geir H. Bolstad, Salomé Bourg, Ingo Brigandt, Anne Calof, James M. Cheverud, Josselin Clo, Frietson Galis, Mark Grabowski, Rebecca Green, Benedikt Hallgrímsson, Thomas F. Hansen, Agnes Holstad, David Houle, David Jablonski, Arthur Lander, Arnaud LeRouzic, Alan C. Love, Ralph Marcucio, Michael B. Morrissey, Laura Nuño de la Rosa, Øystein H. Opedal, Mihaela Pavličev, Christophe Pélabon, Jane M. Reid, Heather Richbourg, Jacqueline L. Sztepanacz, Masahito Tsuboi, Cristina Villegas, Marta Vidal-García, Kjetil L. Voje, Andreas Wagner, Günter P. Wagner, Nathan M. Young
From the influential author of Dynamics in Action, how the concept of constraints provides a way to rethink relationships, opening the way to intentional, meaningful causation.

Grounding her work in the problem of causation, Alicia Juarrero challenges previously held beliefs that only forceful impacts are causes. Constraints, she claims, bring about effects as well, and they enable the emergence of coherence. In Context Changes Everything, Juarrero shows that coherence is induced by enabling constraints, not forceful causes, and that the resulting coherence is then maintained by constitutive constraints. Constitutive constraints, in turn, become governing constraints that regulate and modulate the way coherent entities behave. Using the tools of complexity science, she offers a rigorously scientific understanding of identity, hierarchy, and top-down causation, and in so doing, presents a new way of thinking about the natural world. Juarrero argues that personal identity, which has been thought to be conferred through internal traits (essential natures), is grounded in dynamic interdependencies that keep coherent structures whole. This challenges our ideas of identity, as well as the notion that stability means inflexible rigidity. On the contrary, rigid entities are brittle and cannot persist. Complexity science, says Juarrero, can shape how we meet the world, how what emerges from our interactions finds coherence, and how humans can shape identities that are robust and resilient. This framework has significant implications for sociology, economics, political theory, business, and knowledge management, as well as psychology, religion, and theology. It points to a more expansive and synthetic philosophy about who we are and about the coherence of living and nonliving things alike.
How we can enact meaningful change in computing to meet the urgent need for sustainability and justice.

The deep entanglement of information technology with our societies has raised hope for a transition to more sustainable and just communities—those that phase out fossil fuels, distribute public goods fairly, allow free access to information, and waste less. In principle, computing should be able to help. But in practice, we live in a world in which opaque algorithms steer us toward misinformation and unsustainable consumerism. Insolvent shows why computing’s dominant frame of thinking is conceptually insufficient to address our current challenges, and why computing continues to incur societal debts it cannot pay back. Christoph Becker shows how we can reorient design perspectives in computer science to better align with the values of sustainability and justice.

Becker positions the role of information technology and computing in environmental sustainability, social justice, and the intersection of the two, and explains why designing IT for just sustainability is both technically and ethically challenging. Becker goes on to argue that computing could be aided by critical friends—disciplines that draw on critical social theory, feminist thought, and systems thinking—to make better sense of its role in society. Finally, Becker demonstrates that it is possible to fuse critical perspectives with work in computer science, showing new and fruitful directions for computing professionals and researchers to pursue.
A ground-breaking study on how natural disasters can escalate or defuse wars, insurgencies, and other strife.

Armed conflict and natural disasters have plagued the twenty-first century. Not since the end of World War II has the number of armed conflicts been higher. At the same time, natural disasters have increased in frequency and intensity over the past two decades, their impacts worsened by climate change, urbanization, and persistent social and economic inequalities. Providing the first comprehensive analysis of the interplay between natural disasters and armed conflict, Catastrophes, Confrontations, and Constraints explores the extent to which disasters facilitate the escalation or abatement of armed conflicts—as well as the ways and contexts in which combatants exploit these catastrophes.

Tobias Ide utilizes both qualitative insights and quantitative data to explain the link between disasters and the (de-)escalation of armed conflict and presents over thirty case studies of earthquakes, droughts, floods, and storms in Africa, the Middle East, Asia, and Latin America. He also examines the impact of COVID-19 on armed conflicts in Iraq, Afghanistan, Nigeria, and the Philippines.

Catastrophes, Confrontations, and Constraints is an invaluable addition to current debates on climate change, environmental stress, and security. Professionals and students will greatly appreciate the wealth of timely data it provides for their own investigations.
What global shifts in markets and power mean for the politics and governance of sustainability.

In recent years, major shifts in global markets from North to South have created a new geography of trade and consumption, particularly in the agricultural sector. How this shift affects the governance of sustainability, and thus the future of the planet, is the pressing topic Philip Schleifer takes up in this book. The processes of twenty-first-century globalization are fundamentally changing the politics and governance of commodity production, Schleifer argues, with profound implications for the environment in the food-producing countries of the Global South. At the center of Schleifer's study are Brazil and Indonesia—two key sites of experimentation in new models of global environmental and commodity governance—where palm oil and soy supply chains have seen unprecedented degrees of private environmental governance in recent years. However, instead of transforming these industries, the diffusion of transnational sustainability standards has accompanied a worsening ecological crisis, with mounting evidence of increasingly strong links between deforestation and globalization in twenty-first-century agricultural trade. To uncover the causes of this governance failure, Schleifer develops a multi-level framework for analyzing how contemporary globalization is reconfiguring the political economies of such industries. The result is the first comprehensive analysis of the shift of global agricultural trade to the South and the deepening crisis of commodity-driven deforestation—and a complex and evolving picture of both the risks and opportunities for sustainability presented by this transformative shift.
Essays on the challenges and risks of designing algorithms and platforms for children, with an emphasis on algorithmic justice, learning, and equity.

One in three Internet users worldwide is a child, and what children see and experience online is increasingly shaped by algorithms. Though children’s rights and protections are at the center of debates on digital privacy, safety, and Internet governance, the dominant online platforms have not been constructed with the needs and interests of children in mind. The editors of this volume, Mizuko Ito, Remy Cross, Karthik Dinakar, and Candice Odgers, focus on understanding diverse children’s evolving relationships with algorithms, digital data, and platforms and offer guidance on how stakeholders can shape these relationships in ways that support children’s agency and protect them from harm. This book includes essays reporting original research on educational programs in AI, relational robots, and Scratch programming, on children’s views on digital privacy and artificial intelligence, and on discourses around educational technologies. Shorter opinion pieces add the perspectives of an instructional designer, a social worker, and parents. The contributing social, behavioral, and computer scientists represent perspectives and contexts that span education, commercial tech platforms, and home settings. They analyze problems and offer solutions that elevate the voices and agency of parents and children. Their essays also build on recent research examining how social media, digital games, and learning technologies reflect and reinforce unequal childhoods.

Contributors: Paulo Blikstein, Izidoro Blikstein, Marion Boulicault, Cynthia Breazeal, Michelle Ciccone, Sayamindu Dasgupta, Devin Dillon, Stefania Druga, Jacqueline M. Kory-Westlund, Aviv Y. Landau, Benjamin Mako Hill, Adriana Manago, Siva Mathiyazhagan, Maureen Mauk, Stephanie Nguyen, W. Ian O’Byrne, Kathleen A. Paciga, Milo Phillips-Brown, Michael Preston, Stephanie M. Reich, Nicholas D. Santer, Allison Stark, Elizabeth Stevens, Kristen Turner, Desmond Upton Patton, Veena Vasudevan, Jason Yip
The art of mashup music, its roots in parody, and its social and legal implications.

Parody needn’t recognize copyright—but does an algorithm recognize parody? The ever-increasing popularity of remix culture and mashup music, where parody is invariably at play, presents a conundrum for internet platforms, with their extensive automatic, algorithmic policing of content. Taking a wide-ranging look at mashup music—the creative and technical considerations that go into making it; the experience of play, humor, enlightenment, and beauty it affords; and the social and legal issues it presents—Parody in the Age of Remix offers a pointed critique of how society balances the act of regulating art with the act of preserving it. In several jurisdictions, national and international, parody is exempted from copyright laws. Mashups should be understood as a form of parody, Ragnhild Brøvig-Hanssen contends, and thus protected from removal from hosting platforms. Nonetheless, current copyright-related content-moderation regimes, relying on algorithmic detection and automated decision making, frequently eliminate what might otherwise be deemed gray-area content—to the detriment of human listeners and, especially, artists. Given the inaccuracy of takedowns, Parody in the Age of Remix makes a persuasive argument for greater protection for remix creativity in the future—but it also suggests that the content moderation challenges facing mashup producers and other remixers are symptomatic of larger societal issues concerning positional power, the privatization of the law, and the unjust regulation of culture.
How Bulgaria transformed the computer industry behind the Iron Curtain—and the consequences of that transformation for a society that dreamt of a brighter future.

Bulgaria in 1963 was a communist country led by a centralized party trying to navigate a multinational Cold War. The state needed money, and it sought prestige. By cultivating a burgeoning computer industry, Bulgaria achieved both but at great cost to the established order. In Balkan Cyberia, Victor Petrov elevates a deeply researched, local story of ambition into an essential history of global innovation, ideological conflict, and exchange. Granted tremendous freedom by the Politburo and backed by a concerted state secret intelligence effort, a new, privileged class of technical intellectuals and managers rose to prominence in Bulgaria in the 1960s. Plugged in to transnational business and professional networks, they strove to realize the party’s radical dreams of utopian automation, and Bulgaria would come to manufacture up to half of the Eastern Bloc’s electronics. Yet, as Petrov shows, the export-oriented nature of the industry also led to the disruption of party rule. Technicians, now thinking with and through computers, began to recast the dominant intellectual discourse within a framework of reform, while technocratic managers translated their newfound political clout into economic power that served them well before and after the revolutions of 1989.

Balkan Cyberia reveals the extension of economic and political networks of influence far past the reputed fall of communism, along with the pivotal role small countries played in geopolitical games at the time. Through the prism of the Bulgarian computer industry, the true nature of the socialist international economy, and indeed the links between capitalism and communism, emerge.
How modern notions of architectural style were born—and the debates they sparked in nineteenth-century Germany.

The term style has fallen spectacularly out of fashion in architectural circles. Once a conceptual key to understanding architecture’s inner workings, today style seems to be associated with superficiality, formalism, and obsolete periodization. But how did style in architecture—once defined by German sociologist Georg Simmel as a place where one is “no longer alone”—actually work? How was it used and what did it mean? In Style and Solitude, Mari Hvattum seeks to understand the apparent death of style, returning to its birthplace in the late eighteenth century, and charting how it grew to influence modern architectural discourse and practice. As Hvattum explains, German thinkers of the eighteenth and nineteenth centuries offered competing ideas of what style was and how it should be applied in architecture. From Karl Friedrich Schinkel’s thoughtful eclecticism to King Maximilian II’s attempt to capture the zeitgeist in an architectural competition, style was at the center of fascinating experiments and furious disputes. Starting with Johann Joachim Winckelmann’s invention of the period style and ending a century later with Gottfried Semper’s generative theory of style, Hvattum explores critical debates that are still ongoing today.
Literature and neuroscience come together to illuminate the human experience of beauty, which unfolds in time.

How does beauty exist in time? This is Gabrielle Starr’s central concern in Just in Time as she explores the experience of beauty not as an abstraction, but as the result of psychological and neurological processes in which time is central. Starr shows that aesthetic experience has temporal scale. Starr, a literary scholar and pioneer in the field and method of neuroaesthetics, which seeks the neurological basis of aesthetic experience, applies this methodology to the study of beauty in literature, considering such authors as Rita Dove, Gerard Manley Hopkins, Henry James, Toni Morrison, and Wallace Stevens, as well as the artists Dawoud Bey and Jasper Johns.

Just in Time is richly informed by the methods and findings of neuroscientists, whose instruments let them investigate encounters with art down to the millisecond, but Starr goes beyond the laboratory to explore engagements with art that unfold over durations experiments cannot accommodate. In neuroaesthetics, Starr shows us, the techniques of the empirical sciences and humanistic interpretation support and complement one another. To understand the temporal quality of aesthetic experience we need both cognitive and phenomenological approaches, and this book moves boldly toward their synthesis.
Ideologically opposed, technologically cooperative—an original account of US and USSR industrialization between the world wars.

Between 1927 and 1945, a tide of hyperindustrialization washed over the United States and the Soviet Union. While the two countries remained ideologically opposed, the factories that amassed in Stalingrad, Moscow, Detroit, Buffalo, and Cleveland were strikingly similar, as were the new forms of modern work and urban and infrastructural development that supported this industrialization. Drawing on previously unknown archival materials and photographs, the essays in Detroit-Moscow-Detroit document a stunning two-way transfer of technical knowledge between the United States and the USSR that greatly influenced the built environment in both countries, elevating each to the status of a major industrial power by the start of the Second World War. The innovative research presented here explores spatial development, manufacturing, mass production, and organizational planning across geopolitical lines to demonstrate that capitalist and communist built environments in the twentieth century were not diametrically opposed and were, on certain sites, coproduced in a period of intense technical exchange between the two world wars. A fresh account of the effects of industrialization and globalization on US and Soviet cultures, architecture, and urban history, Detroit-Moscow-Detroit will find wide readership among architects, urban designers, and scholars of architectural, urban, and twentieth-century history.

Contributors: Richard Anderson, Robert Bird, Oksana Chabanyuk, Jean-Louis Cohen, Christina E. Crawford, Robert Fishman, Christina Kiaer, Evgeniia Konysheva, Mark G. Meerovich, Sonia Melnikova-Raich, Lewis H. Siegelbaum, Maria C. Taylor, Claire Zimmerman, Katherine Zubovich.
How modern architectural language was invented to communicate with the divine—challenging a common narrative of European architectural history.

The architectural drawing might seem to be a quintessentially modern form, and indeed many histories of the genre begin in the early modern period with Italian Renaissance architects such as Alberti. Yet the Middle Ages also had a remarkably sophisticated way of drawing and writing about architecture. God’s Own Language takes us to twelfth-century Paris, where a Scottish monk named Richard of Saint Victor, along with his mentor Hugh, developed an innovative visual and textual architectural language. In the process, he devised techniques and terms that we still use today, from sectional elevations to the word “plan.”

Surprisingly, however, Richard’s detailed drawings appeared not in an architectural treatise but in a widely circulated set of biblical commentaries. Seeing architecture as a way of communicating with the divine, Richard drew plans and elevations for such biblical constructions as Noah’s ark and the temple envisioned by the prophet Ezekiel. Interpreting Richard and Hugh’s drawings and writings within the context of the thriving theological and intellectual cultures of medieval Paris, Karl Kinsella argues that the popularity of these works suggests that, centuries before the Renaissance, there was a large circle of readers with a highly developed understanding of geometry and the visual language of architecture.
An accessible introduction to constructing and interpreting Bayesian models of perceptual decision-making and action.

Many forms of perception and action can be mathematically modeled as probabilistic—or Bayesian—inference, a method used to draw conclusions from uncertain evidence. According to these models, the human mind behaves like a capable data scientist or crime scene investigator when dealing with noisy and ambiguous data. This textbook provides an approachable introduction to constructing and reasoning with probabilistic models of perceptual decision-making and action. Featuring extensive examples and illustrations, Bayesian Models of Perception and Action is the first textbook to teach this widely used computational framework to beginners.

- Introduces Bayesian models of perception and action, which are central to cognitive science and neuroscience
- Beginner-friendly pedagogy includes intuitive examples, daily life illustrations, and gradual progression of complex concepts
- Broad appeal for students across psychology, neuroscience, cognitive science, linguistics, and mathematics
- Written by leaders in the field of computational approaches to mind and brain
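The Bayesian inference described above can be sketched in a few lines of Python. This is a toy illustration under assumed numbers, not an example from the textbook: an observer combines a uniform prior over candidate positions with a Gaussian-shaped likelihood for a noisy sensory cue, yielding a posterior that favors the position closest to the cue.

```python
import math

# Toy perceptual inference: where is the object? (All numbers hypothetical.)
hypotheses = [-1.0, 0.0, 1.0]                            # candidate positions
prior = {h: 1.0 / len(hypotheses) for h in hypotheses}   # uniform prior belief

def likelihood(cue, h, noise=1.0):
    """Gaussian-shaped likelihood of observing `cue` if the true position is h."""
    return math.exp(-((cue - h) ** 2) / (2.0 * noise ** 2))

cue = 0.8  # a single noisy sensory measurement

# Bayes' rule: posterior ∝ prior × likelihood, then normalize to sum to 1.
unnormalized = {h: prior[h] * likelihood(cue, h) for h in hypotheses}
z = sum(unnormalized.values())
posterior = {h: p / z for h, p in unnormalized.items()}

# The posterior concentrates on the hypothesis nearest the cue (h = 1.0 here).
best = max(posterior, key=posterior.get)
```

The same prior-times-likelihood-then-normalize step underlies the far richer continuous models the book develops.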
The first comprehensive textbook on regression modeling for linguistic data offers an incisive conceptual overview along with worked examples that teach practical skills for realistic data analysis.

In the first comprehensive textbook on regression modeling for linguistic data in a frequentist framework, Morgan Sonderegger provides graduate students and researchers with an incisive conceptual overview along with worked examples that teach practical skills for realistic data analysis. The book features extensive treatment of mixed-effects regression models, the most widely used statistical method for analyzing linguistic data. Sonderegger begins with preliminaries to regression modeling: assumptions, inferential statistics, hypothesis testing, power, and other errors. He then covers regression models for non-clustered data: linear regression, model selection and validation, logistic regression, and applied topics such as contrast coding and nonlinear effects. The last three chapters discuss regression models for clustered data: linear and logistic mixed-effects models as well as model predictions, convergence, and model selection. The book’s focused scope and practical emphasis will equip readers to implement these methods and understand how they are used in current work.

- The only advanced discussion of modeling for linguists
- Uses R throughout, in practical examples using real datasets
- Extensive treatment of mixed-effects regression models
- Contains detailed, clear guidance on reporting models
- Equal emphasis on observational data and data from controlled experiments
- Suitable for graduate students and researchers with computational interests across linguistics and cognitive science
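The book itself works in R with real linguistic datasets; as a language-neutral illustration of the simplest model it covers, ordinary least squares linear regression, here is a minimal Python sketch. The data and the `ols_fit` helper are hypothetical, invented for this example rather than taken from the book.

```python
# Minimal ordinary least squares fit: y ≈ a + b*x (illustrative only).
def ols_fit(xs, ys):
    """Return (intercept, slope) minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope = covariance(x, y) / variance(x); intercept follows from the means.
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical data: reaction time (ms) against word-frequency rank.
xs = [1, 2, 3, 4, 5]
ys = [500, 480, 470, 455, 445]
a, b = ols_fit(xs, ys)  # negative slope: higher frequency, faster responses
```

Mixed-effects models, the book's central topic, extend this by adding per-speaker and per-item random effects on top of such fixed effects.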