Breath sounds have long been important indicators of respiratory health and disease. Acoustical monitoring of respiratory sounds has been used by researchers for various diagnostic purposes. A few decades ago, physicians relied on their hearing to detect symptomatic signs in the respiratory sounds of their patients. In recent years, however, aided by computer technology and digital signal processing techniques, breath sound analysis has drawn much attention because of its diagnostic capabilities. Computerized respiratory sound analysis can now quantify changes in lung sounds, make permanent records of the measurements, and produce graphical representations that help with the diagnosis and treatment of patients suffering from lung diseases. Digital signal processing techniques have been widely used to derive characteristic features of lung sounds for both diagnosis and assessment of treatment. Although the analytical techniques of signal processing are largely independent of the application, interpretation of their results on biological data, i.e., respiratory sounds, requires substantial understanding of the physiological system involved. This lecture series begins with an overview of the anatomy and physiology of the human respiratory system, and proceeds to advanced research in respiratory sound analysis and modeling and their application as diagnostic aids. Although some of the signal processing techniques used are explained briefly, the intention of this book is not to describe the analytical methods of signal processing but their application and how the results can be interpreted. The book is written for engineers with university-level knowledge of mathematics and digital signal processing.
Heredity performs literal communication of immensely long genomes through immensely long time intervals. Genomes nevertheless incur sporadic errors, referred to as mutations, which have significant and often dramatic effects after a time interval as short as a human life. How can faithfulness at a very large timescale and unfaithfulness at a very short one be reconciled? The engineering problem of literal communication was completely solved during the second half of the 20th century. Originating in 1948 from Claude Shannon's seminal work, information theory provided means to measure information quantities and proved that communication is possible through an unreliable channel (by means left unspecified) up to a sharp limit referred to as its capacity, beyond which communication becomes impossible. The quest for engineering means of reliable communication, named error-correcting codes, did not succeed in closely approaching capacity until 1993, when Claude Berrou and Alain Glavieux invented turbocodes. Today, the electronic devices that pervade our daily lives (e.g., CD, DVD, mobile phone, digital television) could not work without highly efficient error-correcting codes. Reliable communication through unreliable channels up to the limit of what is theoretically possible has become a practical reality: an outstanding achievement, however little publicized. As an engineering problem that nature solved aeons ago, heredity is relevant to information theory. The capacity of DNA is easily shown to vanish exponentially fast, which entails that error-correcting codes must be used to regenerate genomes so as to faithfully transmit the hereditary message. Moreover, assuming that such codes exist explains basic and conspicuous features of the living world, e.g., the existence of discrete species and their hierarchical taxonomy, the necessity of successive generations, and even the trend of evolution towards increasingly complex beings.
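By way of illustration, here is a minimal sketch (not taken from the book itself) of the capacity idea described above, assuming the simplest textbook model, a binary symmetric channel. The function names are illustrative; `cascade_capacity` shows how capacity decays toward zero when the channel is traversed repeatedly, the same qualitative effect claimed for uncorrected DNA copying.

```python
import math

def h2(p):
    """Binary entropy function in bits; h2(0) = h2(1) = 0 by convention."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Shannon capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

def cascade_capacity(p, n):
    """Capacity after n cascaded copies of the channel: crossover errors
    compound, the effective crossover probability drifts toward 1/2,
    and the capacity decays toward zero."""
    q = 0.5 * (1.0 - (1.0 - 2.0 * p) ** n)  # effective crossover after n stages
    return bsc_capacity(q)
```

For example, a per-copy error rate of only 1% leaves essentially no capacity after a thousand uncorrected copyings, which is the quantitative motivation for genomic error correction.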
Providing geneticists with an introduction to information theory and error-correcting codes as necessary tools of hereditary communication is the primary goal of this book. Some biological consequences of their use are also discussed, and guesses about hypothesized genomic codes are presented. Another goal is prompting communication engineers to get interested in genetics and biology, thereby broadening their horizon far beyond the technological field, and learning from the most outstanding engineer: Nature. Table of Contents: Foreword / Introduction / A Brief Overview of Molecular Genetics / An Overview of Information Theory / More on Molecular Genetics / More on Information Theory / An Outline of Error-Correcting Codes / DNA is an Ephemeral Memory / A Toy Living World / Subsidiary Hypothesis, Nested System / Soft Codes / Biological Reality Conforms to the Hypotheses / Identification of Genomic Codes / Conclusion and Perspectives
The present book illustrates the theoretical aspects of several methodologies related to the possibility of (i) enhancing the poor spatial information of the electroencephalographic (EEG) activity on the scalp and providing a measure of the electrical activity on the cortical surface, (ii) estimating the directional influences between any given pair of channels in a multivariate dataset, and (iii) modeling brain networks as graphs. The possible applications are discussed in three different experimental designs concerning (i) the study of pathological conditions during a motor task, (ii) the study of memory processes during a cognitive task, and (iii) the study of the instantaneous dynamics throughout the evolution of a motor task in physiological conditions. The main outcome of all those studies indicates clearly that the performance of cognitive and motor tasks, as well as the presence of neural diseases, can affect the brain network topology. This evidence indicates that mathematical indices derived from graph theory can reflect cerebral "states" or "traits". In particular, the observed structural changes could critically depend on patterns of synchronization and desynchronization, i.e., the dynamic binding of neural assemblies, as also suggested by a wide range of previous electrophysiological studies. Moreover, the fact that these patterns occur at multiple frequencies supports the evidence that brain functional networks contain multiple frequency channels along which information is transmitted. The graph theoretical approach represents an effective means to evaluate the functional connectivity patterns obtained from scalp EEG signals. The possibility of describing the complex brain networks subserving different functions in humans by means of "numbers" is a promising tool toward a better understanding of brain functions.
Table of Contents: Introduction / Brain Functional Connectivity / Graph Theory / High-Resolution EEG / Cortical Networks in Spinal Cord Injured Patients / Cortical Networks During a Lifelike Memory Task / Application to Time-varying Cortical Networks / Conclusions
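As a hedged sketch of the graph-theoretical indices mentioned above (the code is illustrative, not from the lecture): once an EEG connectivity matrix has been thresholded into a binary adjacency matrix, standard metrics such as the characteristic path length can be computed with a breadth-first search.

```python
from collections import deque

def shortest_path_lengths(adj, src):
    """BFS distances from src in an undirected, unweighted graph
    given as a 0/1 adjacency matrix; unreachable nodes stay None."""
    n = len(adj)
    dist = [None] * n
    dist[src] = 0
    q = deque([src])
    while q:
        u = q.popleft()
        for v in range(n):
            if adj[u][v] and dist[v] is None:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def characteristic_path_length(adj):
    """Mean shortest-path length over all connected, distinct node pairs."""
    n = len(adj)
    total, pairs = 0, 0
    for s in range(n):
        for d in shortest_path_lengths(adj, s):
            if d:  # skips the source itself (0) and unreachable nodes (None)
                total += d
                pairs += 1
    return total / pairs

# toy 4-node ring, as might result from thresholding an EEG connectivity matrix
ring = [[0, 1, 0, 1],
        [1, 0, 1, 0],
        [0, 1, 0, 1],
        [1, 0, 1, 0]]
```

Shifts in such indices between experimental conditions are the kind of "numbers" the blurb refers to when speaking of cerebral "states" or "traits".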
E-health is closely related to networks and telecommunications, dealing with applications that collect or transfer medical data from distant locations for remote medical collaboration and diagnosis. In this book we provide an overview of the fields of image and signal processing for networked and distributed e-health applications and their supporting technologies. The book is structured in 10 chapters, starting the discussion at the lower end, with the acquisition and processing of biosignals and medical images, and ending with complex virtual reality systems and techniques that provide more intuitive interaction in a networked medical environment. The book also discusses networked clinical decision support systems and corresponding medical standards, WWW-based applications, medical collaborative platforms, wireless networking, and the concepts of ambient intelligence and pervasive computing in electronic healthcare systems.
This lecture book is intended to be an accessible and comprehensive introduction to random signal processing with an emphasis on real-world applications of biosignals. Although the material has been written and developed primarily for advanced undergraduate biomedical engineering students, it will also be of interest to engineers and interested biomedical professionals of any discipline seeking an introduction to the field. Within education, most biomedical engineering programs aim to provide the knowledge required of a graduate student, while undergraduate programs are geared toward designing circuits and evaluating only cardiac signals. Very few programs teach the processes with which to evaluate brainwaves, sleep, respiratory sounds, heart valve sounds, electromyograms, electro-oculograms, or other random signals acquired from the body. The primary goal of this lecture book is to help the reader understand the time and frequency domain processes that may be used to evaluate random physiological signals. A secondary goal is to learn to evaluate actual mammalian data without spending most of the time writing software programs. This publication utilizes DADiSP, a digital signal processing software package from the DSP Development Corporation.
The replacement or augmentation of failing human organs with artificial devices and systems has been an important element in health care for several decades. Such devices as kidney dialysis machines to augment failing kidneys, artificial heart valves to replace failing human valves, cardiac pacemakers to reestablish normal cardiac rhythm, and heart assist devices to augment a weakened human heart have assisted millions of patients over the previous 50 years and offer lifesaving technology for tens of thousands of patients each year. Significant advances in these biomedical technologies have continually occurred during this period, saving numerous lives with cutting-edge technologies. Each of these artificial organ systems is described in detail in a separate section of this lecture.
The senses of human hearing and sight are often taken for granted by many individuals until they are lost or adversely affected. Millions of individuals suffer from partial or total hearing loss, and millions of others have impaired vision. The technologies associated with augmenting these two human senses range from simple hearing aids to complex cochlear implants, and from (now commonplace) intraocular lenses to complex artificial corneas. Both human hearing and human sight are described in detail, along with the associated array of technologies.
This is the first in a series of short books on probability theory and random processes for biomedical engineers. This text is written as an introduction to probability theory. The goal is to prepare students, engineers, and scientists at all levels of background and experience for the application of this theory to a wide variety of problems, as well as to pursue these topics at a more advanced level. The approach is to present a unified treatment of the subject. There are only a few key concepts involved in the basic theory of probability. These key concepts are all presented in the first chapter. The second chapter introduces the topic of random variables. Later chapters simply expand upon these key ideas and extend the range of application. A considerable effort has been made to develop the theory in a logical manner, developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology, both within the engineering community and the probability and statistics literature. Biomedical engineering examples are introduced throughout the text, and a large number of self-study problems are available for the reader.
This is the second in a series of three short books on probability theory and random processes for biomedical engineers. This volume focuses on expectation, standard deviation, moments, and the characteristic function. In addition, conditional expectation, conditional moments, and the conditional characteristic function are also discussed. Jointly distributed random variables are described, along with joint expectation, joint moments, and the joint characteristic function. Convolution is also developed. A considerable effort has been made to develop the theory in a logical manner, developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Every effort has been made to be consistent with commonly used notation and terminology, both within the engineering community and the probability and statistics literature. The aim is to prepare students for the application of this theory to a wide variety of problems, as well as to give practicing engineers and researchers a tool to pursue these topics at a more advanced level. Pertinent biomedical engineering examples are used throughout the text.
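As a brief illustrative sketch of the quantities this volume covers (not an excerpt from the book), expectation, variance, and the characteristic function of a discrete random variable can be computed directly from its probability mass function:

```python
import cmath

def expectation(pmf, g=lambda x: x):
    """E[g(X)] for a discrete PMF given as a {value: probability} dict."""
    return sum(p * g(x) for x, p in pmf.items())

def variance(pmf):
    """Var(X) = E[(X - E[X])**2], the second central moment."""
    mu = expectation(pmf)
    return expectation(pmf, lambda x: (x - mu) ** 2)

def char_fn(pmf, t):
    """Characteristic function phi(t) = E[exp(j*t*X)]."""
    return sum(p * cmath.exp(1j * t * x) for x, p in pmf.items())

# example: a fair six-sided die
die = {k: 1 / 6 for k in range(1, 7)}
```

For the fair die, E[X] = 3.5 and Var(X) = 35/12, and phi(0) = 1 as it must for any distribution.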
This is the third in a series of short books on probability theory and random processes for biomedical engineers. This book focuses on standard probability distributions commonly encountered in biomedical engineering. The exponential, Poisson, and Gaussian distributions are introduced, as well as important approximations to the Bernoulli PMF and Gaussian CDF. Many important properties of jointly Gaussian random variables are presented. The primary subjects of the final chapter are methods for determining the probability distribution of a function of a random variable. We first evaluate the probability distribution of a function of one random variable using the CDF and then the PDF. Next, the probability distribution for a single random variable is determined from a function of two random variables using the CDF. Then, the joint probability distribution is found from a function of two random variables using the joint PDF and the CDF. The aim of all three books is to serve as an introduction to probability theory for students, engineers, and researchers who wish to apply this theory to a wide variety of problems, as well as to pursue these topics at a more advanced level. The theory material is presented in a logical manner, developing special mathematical skills as needed. The mathematical background required of the reader is basic knowledge of differential calculus. Pertinent biomedical engineering examples are used throughout the text. Drill problems, straightforward exercises designed to reinforce concepts and develop problem solution skills, follow most sections.
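The CDF method mentioned above can be illustrated with a minimal sketch (assumed example, not from the book): for Y = X**2 with X uniform on (0, 1), the method gives P(Y <= y) = P(X <= sqrt(y)) = sqrt(y), which a Monte Carlo simulation can confirm.

```python
import math
import random

def cdf_Y(y):
    """Analytic CDF of Y = X**2 for X ~ Uniform(0, 1), by the CDF method:
    P(Y <= y) = P(X <= sqrt(y)) = sqrt(y) for 0 <= y <= 1."""
    return math.sqrt(min(max(y, 0.0), 1.0))

def empirical_cdf_Y(y, n=200_000, seed=1):
    """Monte Carlo estimate of P(X**2 <= y) to check the analytic result."""
    rng = random.Random(seed)
    return sum(rng.random() ** 2 <= y for _ in range(n)) / n
```

With y = 0.25 the analytic CDF is exactly 0.5, and the simulated fraction agrees to within sampling error.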
This short book provides basic information about bioinstrumentation and electric circuit theory. Many biomedical instruments use a transducer or sensor to convert a signal created by the body into an electric signal. Our goal here is to develop expertise in electric circuit theory applied to bioinstrumentation. We begin with a description of the variables used in circuit theory: charge, current, voltage, power, and energy. Next, Kirchhoff's current and voltage laws are introduced, followed by resistance, simplifications of resistive circuits, and voltage and current calculations. Circuit analysis techniques are then presented, followed by inductance and capacitance, and solutions of circuits using the differential equation method. Finally, the operational amplifier and time-varying signals are introduced. This lecture is written for a student, researcher, or engineer who has completed the first two years of an engineering program (i.e., three semesters of calculus and differential equations). A considerable effort has been made to develop the theory in a logical manner, developing special mathematical skills as needed. At the end of the short book is a wide selection of problems, ranging from simple to complex.
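The resistive-circuit simplifications and voltage calculations described above reduce to a few short formulas; the following sketch (illustrative helper names, not from the book) computes series and parallel equivalents and the classic voltage divider that follows from Kirchhoff's voltage law and Ohm's law.

```python
def series(*rs):
    """Equivalent resistance of resistors connected in series."""
    return sum(rs)

def parallel(*rs):
    """Equivalent resistance of resistors connected in parallel."""
    return 1.0 / sum(1.0 / r for r in rs)

def divider(v_in, r1, r2):
    """Voltage across r2 when r1 and r2 are in series across v_in.
    KVL gives v_in = i*(r1 + r2); Ohm's law gives v_out = i*r2."""
    return v_in * r2 / (r1 + r2)
```

For instance, a 10 V source across two equal 1 kOhm resistors in series places 5 V across each.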
In the past 50 years there has been an explosion of interest in the development of technologies whose end goal is to connect the human brain and/or nervous system directly to computers. Once the subject of science fiction, the technologies necessary to accomplish this goal are rapidly becoming reality. In laboratories around the globe, research is being undertaken to restore function to the physically disabled, to replace areas of the brain damaged by disease or trauma, and to augment human abilities. Building neural interfaces and neuroprosthetics relies on a diverse array of disciplines such as neuroscience, engineering, medicine, and microfabrication, to name just a few. This book presents a short history of neural interfacing (N.I.) research and introduces the reader to some of the current efforts to develop neural prostheses. The book is intended as an introduction for the college freshman or others wishing to learn more about the field. A resource guide is included for students, along with a list of laboratories conducting N.I. research and universities with N.I.-related tracks of study. Table of Contents: Neural Interfaces Past and Present / Current Neuroprosthesis Research / Conclusion / Resources for Students
This book aims to provide vital information about the growing field of bionanotechnology for undergraduate and graduate students, as well as working professionals in various fields. The fundamentals of nanotechnology are covered, along with several specific bionanotechnology applications, including nanobioimaging and drug delivery, which is a growing $100 billion industry. The uniqueness of the field has been brought out with unparalleled lucidity; a balance between important insight into the synthetic methods of preparing stable nanostructures and a focus driven by medical applications educates and informs the reader on the impact of this emerging field. Critical examination of potential threats, followed by a current global outlook, completes the discussion. In short, the book takes you through a journey from fundamentals to frontiers of bionanotechnology so that you can understand and make informed decisions on the impact of bionano on your career and business.
The biomedical engineering senior capstone design course is probably the most important course taken by undergraduate biomedical engineering students. It provides them with the opportunity to apply what they have learned in previous years; develop their communication (written, oral, and graphical), interpersonal (teamwork, conflict management, and negotiation), project management, and design skills; and learn about the product development process. It also provides students with an understanding of the economic, financial, legal, and regulatory aspects of the design, development, and commercialization of medical technology. The capstone design experience can change the way engineering students think about technology, society, themselves, and the world around them. It gives them a short preview of what it will be like to work as an engineer. It can make them aware of their potential to make a positive contribution to health care throughout the world and generate excitement for and pride in the engineering profession. Working on teams helps students develop an appreciation for the many ways team members, with different educational, political, ethnic, social, cultural, and religious backgrounds, look at problems. They learn to value diversity and become more willing to listen to different opinions and perspectives. Finally, they learn to value the contributions of nontechnical members of multidisciplinary project teams. Ideas for how to organize, structure, and manage a senior capstone design course for biomedical and other engineering students are presented here. These ideas will be helpful to faculty who are creating a new design course, expanding a current design program to more than the senior year, or just looking for some ideas for improving an existing course. Contents: I. 
Purpose, Goals, and Benefits / Why Our Students Need a Senior Capstone Design Course / Desired Learning Outcomes / Changing Student Attitudes, Perceptions, and Awareness / Senior Capstone Design Courses and Accreditation Board for Engineering and Technology Outcomes / II. Designing a Course to Meet Student Needs / Course Management and Required Deliverables / Projects and Project Teams / Lecture Topics / Intellectual Property and Confidentiality Issues in Design Projects / III. Enhancing the Capstone Design Experience / Industry Involvement in Capstone Design Courses / Developing Business and Entrepreneurial Literacy / Providing Students with a Clinical Perspective / Service Learning Opportunities / Collaboration with Industrial Design Students / National Student Design Competitions / Organizational Support for Senior Capstone Design Courses / IV. Meeting the Changing Needs of Future Engineers / Capstone Design Courses and the Engineer of 2020
There are many books written about statistics, some brief, some detailed, some humorous, some colorful, and some quite dry. Each of these texts is designed for a specific audience. Too often, texts about statistics have been rather theoretical and intimidating for those not practicing statistical analysis on a routine basis. Thus, many engineers and scientists, who need to use statistics much more frequently than calculus or differential equations, lack sufficient knowledge of the use of statistics. The audience that is addressed in this text is the university-level biomedical engineering student who needs a bare-bones coverage of the most basic statistical analysis frequently used in biomedical engineering practice. The text introduces students to the essential vocabulary and basic concepts of probability and statistics that are required to perform the numerical summary and statistical analysis used in the biomedical field. This text is considered a starting point for important issues to consider when designing experiments, summarizing data, assuming a probability model for the data, testing hypotheses, and drawing conclusions from sampled data. A student who has completed this text should have sufficient vocabulary to read more advanced texts on statistics and further their knowledge about additional numerical analyses that are used in the biomedical engineering field but are beyond the scope of this text. This book is designed to supplement an undergraduate-level course in applied statistics, specifically in biomedical engineering. Practicing engineers who have not had formal instruction in statistics may also use this text as a simple, brief introduction to statistics used in biomedical engineering. The emphasis is on the application of statistics, the assumptions made in applying the statistical tests, the limitations of these elementary statistical methods, and the errors often committed in using statistical analysis. 
A number of examples from biomedical engineering research and industry practice are provided to assist the reader in understanding concepts and application. It is beneficial for the reader to have some background in the life sciences and physiology and to be familiar with basic biomedical instrumentation used in the clinical environment. Contents: Introduction / Collecting Data and Experimental Design / Data Summary and Descriptive Statistics / Assuming a Probability Model from the Sample Data / Statistical Inference / Linear Regression and Correlation Analysis / Power Analysis and Sample Size / Just the Beginning / Bibliography
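To illustrate the kind of elementary statistical inference the text covers (a minimal sketch under assumed data, not an example from the book), a two-sample comparison can be reduced to Welch's t statistic, which does not assume equal variances in the two groups:

```python
import math

def mean(xs):
    return sum(xs) / len(xs)

def sample_var(xs):
    """Unbiased sample variance (n - 1 denominator)."""
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def welch_t(a, b):
    """Welch's two-sample t statistic (no equal-variance assumption)."""
    se = math.sqrt(sample_var(a) / len(a) + sample_var(b) / len(b))
    return (mean(a) - mean(b)) / se
```

The statistic is then compared against a t distribution with the Welch-Satterthwaite degrees of freedom; checking such distributional assumptions before applying the test is exactly the kind of caution the text emphasizes.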
Neural interfaces are one of the most exciting emerging technologies to impact bioengineering and neuroscience because they enable an alternate communication channel directly linking the nervous system with man-made devices. This book reveals the essential engineering principles and signal processing tools for deriving control commands from bioelectric signals in large ensembles of neurons. The topics featured include analysis techniques for determining neural representation, modeling in motor systems, computing with neural spikes, and hardware implementation of neural interfaces. Beginning with an exploration of the historical developments that have led to the decoding of information from neural interfaces, this book compares the theory and performance of new neural engineering approaches for BMIs. Contents: Introduction to Neural Interfaces / Foundations of Neuronal Representations / Input-Output BMI Models / Regularization Techniques for BMI Models / Neural Decoding Using Generative BMI Models / Adaptive Algorithms for Point Processes / BMI Systems
In the last ten years, many different brain imaging devices have conveyed a great deal of information about brain functioning under different experimental conditions. In every case, biomedical engineers, together with mathematicians, physicists, and physicians, are called upon to process the signals related to brain activity in order to extract meaningful and robust information to correlate with the external behavior of the subjects. In this endeavor, different signal processing tools used in telecommunications and other fields of engineering, or even the social sciences, have been adapted and reused in the neuroscience field. The present book offers a short presentation of several methods for the estimation of the cortical connectivity of the human brain. The methods presented here are relatively simple to implement and robust, and can return valuable information about the causality of the activation of the different cortical areas in humans using noninvasive electroencephalographic recordings. Knowledge of such signal processing tools will enrich the arsenal of computational methods that an engineer or a mathematician can apply to the processing of brain signals. Table of Contents: Introduction / Estimation of the Effective Connectivity from Stationary Data by Structural Equation Modeling / Estimation of the Functional Connectivity from Stationary Data by Multivariate Autoregressive Methods / Estimation of Cortical Activity by the use of Realistic Head Modeling / Application: Estimation of Connectivity from Movement-Related Potentials / Application to High-Resolution EEG Recordings in a Cognitive Task (Stroop Test) / Application to Data Related to the Intention of Limb Movements in Normal Subjects and in a Spinal Cord Injured Patient / The Instantaneous Estimation of the Time-Varying Cortical Connectivity by Adaptive Multivariate Estimators / Time-Varying Connectivity from Event-Related Potentials
The field of brain imaging is developing at a rapid pace and has greatly advanced the areas of cognitive and clinical neuroscience. The availability of neuroimaging techniques, especially magnetic resonance imaging (MRI), functional MRI (fMRI), diffusion tensor imaging (DTI), magnetoencephalography (MEG), and magnetic source imaging (MSI), has brought about breakthroughs in neuroscience. To obtain comprehensive information about the activity of the human brain, different analytical approaches should complement one another. Thus, in "intermodal multimodality" imaging, great efforts have been made to combine the highest spatial resolution (MRI, fMRI) with the best temporal resolution (MEG or EEG). "Intramodal multimodality" imaging combines various functional MRI techniques (e.g., fMRI, DTI, and/or morphometric/volumetric analysis). The multimodal approach is conceptually based on the combination of different noninvasive functional neuroimaging tools, their registration, and cointegration. In particular, the combination of imaging applications that map different functional systems is useful, such as fMRI as a technique for the localization of cortical function and DTI as a technique for mapping of white matter fiber bundles or tracts. This booklet gives an insight into the wide field of multimodal imaging with respect to concepts, data acquisition, and postprocessing. Examples of intermodal and intramodal multimodality imaging are also demonstrated. Table of Contents: Introduction / Neurological Measurement Techniques and First Steps of Postprocessing / Coordinate Transformation / Examples for Multimodal Imaging / Clinical Aspects of Multimodal Imaging / References / Biography
Take one elephant and one man to the top of a tower and drop both simultaneously. Which will hit the ground first? You are the pilot of a jet fighter performing a high-speed loop. Will you pass out during the maneuver? How can you simulate being an astronaut with your feet still firmly planted on planet Earth? In the aerospace environment, human, animal, and plant physiology differs significantly from that on Earth, and this book provides reasons for some of these changes. The challenges encountered by pilots in their missions can have implications for the health and safety of not only themselves but others. Knowing the effects of hypergravity on the human body during high-speed flight led to the development of human centrifuges. We also need to better understand the physiological responses of living organisms in space. It is therefore necessary to simulate weightlessness through the use of specially adapted equipment, such as clinostats, tilt tables, and body suspension devices. Each of these ideas, and more, is addressed in this review of the physical concepts related to space flights, microgravity, and hypergravity simulations. Basic theories, such as Newton's laws and Einstein's principle, are explained, followed by a look at the biomedical effects of experiments performed in space life sciences institutes, universities, and space agencies. Table of Contents: General Concepts in Physics - Definition of Physical Terms / The Effects of Hypergravity on Biomedical Experiments / The Effects of Microgravity on Biomedical Experiments / References
A medical device is an apparatus that uses engineering and scientific principles to interface to physiology and diagnose or treat a disease. In this Lecture, we specifically consider those medical devices that are computer based, and are therefore referred to as medical instruments. Further, the medical instruments we discuss are those that incorporate system theory into their designs. We divide these types of instruments into those that provide continuous observation and those that provide a single snapshot of health information. These instruments are termed patient monitoring devices and diagnostic devices, respectively. Within this Lecture, we highlight some of the common system theory techniques that are part of the toolkit of medical device engineers in industry. These techniques include the pseudorandom binary sequence, adaptive filtering, wavelet transforms, the autoregressive moving average model with exogenous input, artificial neural networks, fuzzy models, and fuzzy control. Because the clinical usage requirements for patient monitoring and diagnostic devices are so high, system theory is the preferred substitute for heuristic, empirical processing during noise artifact minimization and classification. Table of Contents: Preface / Medical Devices / System Theory / Patient Monitoring Devices / Diagnostic Devices / Conclusion / Author Biography
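Of the system theory techniques listed above, adaptive filtering lends itself to a compact illustration. The following least-mean-squares (LMS) sketch is a generic textbook example under assumed data (a pure gain to be identified), not an implementation from the Lecture:

```python
import random

def lms_identify(x, d, n_taps=1, mu=0.05):
    """Least-mean-squares (LMS) adaptive filter: iteratively adjusts the
    weights w so the filter output tracks the desired signal d given input x."""
    w = [0.0] * n_taps
    for n in range(len(x)):
        # most recent n_taps input samples (zero-padded at the start)
        taps = [x[n - k] if n - k >= 0 else 0.0 for k in range(n_taps)]
        y = sum(wi * xi for wi, xi in zip(w, taps))   # filter output
        e = d[n] - y                                  # instantaneous error
        w = [wi + 2.0 * mu * e * xi for wi, xi in zip(w, taps)]
    return w

# identify an unknown gain of 0.8 from noiseless input/output data
rng = random.Random(0)
x = [rng.uniform(-1.0, 1.0) for _ in range(500)]
d = [0.8 * xi for xi in x]
```

In a patient monitor the same update rule, with more taps and a reference noise input, is what cancels motion or mains-frequency artifact from the measured signal.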
This book is concerned with the study of continuum mechanics applied to biological systems, i.e., continuum biomechanics. This vast and exciting subject allows description of when a bone may fracture due to excessive loading, how blood behaves as both a solid and fluid, down to how cells respond to mechanical forces that lead to changes in their behavior, a process known as mechanotransduction. We have written for senior undergraduate students and first year graduate students in mechanical or biomedical engineering, but individuals working at biotechnology companies that deal in biomaterials or biomechanics should also find the information presented relevant and easily accessible. Table of Contents: Tensor Calculus / Kinematics of a Continuum / Stress / Elasticity / Fluids / Blood and Circulation / Viscoelasticity / Poroelasticity and Thermoelasticity / Biphasic Theory
Tremor represents one of the most common movement disorders worldwide. It affects both sexes and may occur at any age. In most cases, tremor is disabling and causes social difficulties, resulting in poorer quality of life. Tremor is now recognized as a public health issue given the aging of the population. Tremor is a complex phenomenon that has attracted the attention of scientists from various disciplines. Tremor results from dynamic interactions between multiple synaptically coupled neuronal systems and the biomechanical, physical, and electrical properties of the external effectors. There have been major advances in our understanding of tremor pathogenesis over the last three decades, thanks to new imaging techniques and genetic discoveries. Moreover, significant progress in computer technologies, developments of reliable and unobtrusive wearable sensors, improvements in miniaturization, and advances in signal processing have opened new perspectives for the accurate characterization and daily monitoring of tremor. New therapies are emerging. In this book, we provide an overview of tremor from pathogenesis to therapeutic aspects. We review the definitions, the classification of the varieties of tremor, and the contribution of central versus peripheral mechanisms. Neuroanatomical, neurophysiological, neurochemical, and pharmacological topics related to tremor are highlighted. Our goals are to explain the fundamental basis of tremor generation, to show the recent technological developments, especially in instrumentation, which are reshaping research and clinical practice, and to provide up-to-date information related to emerging therapies. An integrative, transdisciplinary approach is used, combining engineering and physiological principles to diagnose, monitor, and treat tremor. Guidelines for evaluation of tremor are explained.
This book has been written for biomedical engineering students, engineers, researchers, medical students, biologists, neurologists, and biomedical professionals of any discipline looking for an updated and multidisciplinary overview of tremor. It can be used for biomedical courses. Table of Contents: Introduction / Anatomical Overview of the Central and Peripheral Nervous System / Physiology of the Nervous System / Characterization of Tremor / Principal Disorders Associated with Tremor / Quantification of Tremor / Mechanisms of Tremor / Treatments
Quantitative Neurophysiology is a supplementary text for a junior or senior level course in neuroengineering. It may also serve as a quick start for graduate students in engineering, physics, or neuroscience, as well as for faculty interested in becoming familiar with the basics of quantitative neuroscience. The first chapter is a review of the structure of the neuron and the anatomy of the brain. Chapters 2-6 derive the theory of active and passive membranes, electrical propagation in axons and dendrites, and the dynamics of the synapse. Chapter 7 is an introduction to modeling networks of neurons and artificial neural networks. Chapters 8 and 9 address the recording and decoding of extracellular potentials. The final chapter describes a number of more advanced or new topics in neuroengineering. Throughout the text, vocabulary is introduced which will enable students to read more advanced literature and communicate with other scientists and engineers working in the neurosciences. Numerical methods are outlined so students with programming knowledge can implement the models presented in the text. Analogies are used to clarify topics and reinforce key concepts. Finally, homework and simulation problems are available at the end of each chapter. Table of Contents: Preface / Neural Anatomy / Passive Membranes / Active Membranes / Propagation / Neural Branches / Synapses / Networks of Neurons / Extracellular Recording and Stimulation / The Neural Code / Applications / Biography / Index
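As an example of the kind of numerical exercise such a text enables, here is a minimal forward-Euler simulation of a passive (leaky RC) membrane responding to a step current. The parameter values are illustrative assumptions, not taken from the book; the model is the standard passive membrane equation C dV/dt = -(V - E)/R + I.

```python
import numpy as np

# Illustrative passive-membrane parameters (assumed, not from the book)
C = 1.0      # membrane capacitance, uF/cm^2
R = 10.0     # membrane resistance, kOhm*cm^2  (tau = R*C = 10 ms)
E = -70.0    # resting potential, mV
dt = 0.01    # time step, ms
T = 100.0    # total simulated time, ms

steps = int(T / dt)
V = np.full(steps, E)        # membrane potential trace
I = np.zeros(steps)          # injected current, uA/cm^2
I[int(20 / dt):int(70 / dt)] = 2.0   # 2 uA/cm^2 step from 20 ms to 70 ms

# Forward-Euler integration of C dV/dt = -(V - E)/R + I
for n in range(steps - 1):
    dV = (-(V[n] - E) / R + I[n]) / C
    V[n + 1] = V[n] + dt * dV

# During the step, V charges exponentially (time constant tau = 10 ms)
# toward the steady state E + I*R = -70 + 20 = -50 mV, then decays back.
```

This ten-line integrator is the seed from which active-membrane (Hodgkin-Huxley style) models grow: the constant 1/R leak is simply replaced by voltage-dependent conductances.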
Lung sounds auscultation is often the first noninvasive resource for detection and discrimination of respiratory pathologies available to the physician through the use of the stethoscope. Hearing interpretation, though, was the only means of appreciating the diagnostic information in lung sounds for many decades. Nevertheless, in recent years, computerized auscultation combined with signal processing techniques has boosted the diagnostic capabilities of lung sounds. The latter were traditionally analyzed and characterized by morphological changes in the time domain using statistical measures, by spectral properties in the frequency domain using simple spectral analysis, or by nonstationary properties in a joint time-frequency domain using the short-time Fourier transform. Advanced signal processing techniques, however, have emerged in the last decade, broadening the perspective in lung sounds analysis. The scope of this book is to present up-to-date signal processing techniques that have been applied to the area of lung sound analysis. It starts with a description of the nature of lung sounds, continues with the introduction of new domains in their representation and new denoising techniques, and concludes with some reflective implications, from both the engineer's and the physician's perspectives. Issues of nonstationarity, nonlinearity, non-Gaussianity, modeling, and classification of lung sounds are addressed with new methodologies, revealing a more realistic approach to their pragmatic nature. Advanced denoising techniques that effectively circumvent the noise presence (e.g., heart sound interference, background noise) in lung sound recordings are described, providing the physician with high-quality auscultative data. The book offers useful information both to engineers and physicians interested in bioacoustics, clearly demonstrating the current trends in lung sound analysis.
Table of Contents: The Nature of Lung Sound Signals / New Domains in LS Representation / Denoising Techniques / Reflective Implications
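The short-time Fourier transform mentioned above as the traditional time-frequency tool can be sketched in a few lines. The "lung sound" below is a synthetic stand-in (broadband noise plus a 400 Hz tone imitating a wheeze-like tonal component); the window length, hop size, and frequencies are illustrative assumptions, not values from the book.

```python
import numpy as np

def stft(x, fs, win_len=256, hop=128):
    """Magnitude STFT with a Hann window (numpy only, illustrative sketch)."""
    win = np.hanning(win_len)
    frames = [x[i:i + win_len] * win
              for i in range(0, len(x) - win_len + 1, hop)]
    spec = np.abs(np.fft.rfft(frames, axis=1))   # frames x frequency bins
    freqs = np.fft.rfftfreq(win_len, 1 / fs)     # bin center frequencies, Hz
    times = np.arange(len(frames)) * hop / fs    # frame start times, s
    return spec, freqs, times

# Synthetic example: 2 s of broadband "breath" noise sampled at 4 kHz,
# with a 400 Hz wheeze-like tone present only during the middle second
fs = 4000
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(1)
signal = 0.3 * rng.standard_normal(len(t))
signal[fs // 2:3 * fs // 2] += np.sin(2 * np.pi * 400 * t[fs // 2:3 * fs // 2])

spec, freqs, times = stft(signal, fs)
# The tonal component appears as a ridge near 400 Hz in the middle frames,
# exactly the kind of joint time-frequency feature auscultation alone cannot localize.
```

This is the baseline against which the newer representations discussed in the book (addressing nonstationarity, nonlinearity, and non-Gaussianity) are an improvement.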
This book provides an introduction to the principles of several of the more widely used methods in medical imaging. Intended for engineering students, it provides a final-year undergraduate- or graduate-level introduction to several imaging modalities, including MRI, ultrasound, and X-Ray CT. The emphasis of the text is on mathematical models for imaging and image reconstruction physics. Emphasis is also given to sources of imaging artefacts. Such topics are usually not addressed across the different imaging modalities in one book, and this is a notable strength of the treatment given here. Table of Contents: Introduction / Diagnostic X-Ray Imaging / X-Ray CT / Ultrasonics / Pulse-Echo Ultrasonic Imaging / Doppler Velocimetry / An Introduction to MRI