After long years of work that have seen little industrial application, high-level synthesis is finally on the verge of becoming a practical tool. The state of high-level synthesis today is similar to the state of logic synthesis ten years ago. At present, logic-synthesis tools are widely used in digital system design. In the future, high-level synthesis will play a key role in mastering design complexity and in truly exploiting the potential of ASICs and PLDs, which demand extremely short design cycles. Work on high-level synthesis began over twenty years ago. Since substantial progress has...
This volume contains a selection of papers that focus on the state-of-the-art in real-time scheduling and resource management. Preliminary versions of these papers were presented at a workshop on the foundations of real-time computing sponsored by the Office of Naval Research in October 1990 in Washington, D.C. A companion volume, Foundations of Real-Time Computing: Formal Specifications and Methods, complements this book by addressing many of the most advanced approaches currently being investigated in the arena of formal specification and verification of real-time systems....
Some twenty years have elapsed since the first attempts at planning were made by researchers in artificial intelligence. These early programs concentrated on the development of plans for the solution of puzzles or toy problems, like the rearrangement of stacks of blocks. They provided the foundation for the work described in this book: the automatic generation of plans for industrial assembly. As one reads about the complex and sophisticated planners in the current generation, it is important to keep in mind that they are addressing real-world problems. Although these systems...
Since the late 1960s, there has been a revolution in robots and industrial automation, from the design of robots with no computing or sensory capabilities (first generation), to the design of robots with limited computational power and feedback capabilities (second generation), to the design of intelligent robots (third generation), which possess diverse sensing and decision-making capabilities. The theory of intelligent machines has developed in parallel with these advances in robot design. This theory is the natural outcome of research and development in...
Efficient Dynamic Simulation of Robotic Mechanisms presents computationally efficient algorithms for the dynamic simulation of closed-chain robotic systems. In particular, the simulation of single closed chains and simple closed-chain mechanisms is investigated in detail. Single closed chains are common in many applications, including industrial assembly operations, hazardous remediation, and space exploration. Simple closed-chain mechanisms include such familiar configurations as multiple manipulators moving a common load, dexterous hands, and multi-legged vehicles. The efficient...
Although there is a burgeoning interest among economists in information economics, much of the literature adopts a reductionist conceptualization of information, defining it exclusively as a reduction in uncertainty and exploring the implications of imperfect information for markets. This neoclassical treatment obscures major interrelations between economic and communicatory processes. Drawing on a range of distinguished scholarship from both the economic and communication studies disciplines, Information and Communication in Economics explores the implications for economic...
The quest for higher-performance digital systems for applications such as general-purpose computing, signal/image processing, and telecommunications, together with an increasing cost consciousness, has led to a major thrust for high-speed VLSI systems implemented in inexpensive and widely available technologies such as CMOS. This monograph, based on the first author's doctoral dissertation, concentrates on the technique of wave pipelining as one method toward achieving this goal. The primary focus of this monograph is to provide a coherent presentation of the theory of wave-pipelined operation of...
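To make the idea concrete for readers new to it, here is a minimal sketch of the timing argument behind wave pipelining; the bound, the parameter names, and the numbers are illustrative assumptions, not results from the monograph.

```python
# A minimal, illustrative timing sketch (not the authors' model): in a
# wave-pipelined block, several data "waves" travel through the same
# combinational logic at once, so the clock period is bounded by the
# spread between the longest and shortest path delays plus register
# overhead, rather than by the longest delay alone.
import math


def min_clock_period(d_max, d_min, t_setup, t_hold, t_skew):
    """Rough lower bound on the clock period (ns) for wave-pipelined
    operation; all parameter names and margins are hypothetical."""
    return (d_max - d_min) + t_setup + t_hold + 2 * t_skew


def waves_in_flight(d_max, t_clk):
    """Approximate number of data waves simultaneously inside the logic
    (the latency in clock cycles) at a given clock period."""
    return math.ceil(d_max / t_clk)


if __name__ == "__main__":
    d_max, d_min = 12.0, 9.5  # hypothetical longest/shortest path delays (ns)
    t_clk = min_clock_period(d_max, d_min, t_setup=0.5, t_hold=0.3, t_skew=0.2)
    print(f"wave-pipelined clock period >= {t_clk:.1f} ns "
          f"(a conventional pipeline needs >= {d_max + 0.5:.1f} ns), "
          f"about {waves_in_flight(d_max, t_clk)} waves in flight")
```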
This book has been written for practitioners, researchers and students in the fields of parallel and distributed computing. Its objective is to provide detailed coverage of the applications of graph theoretic techniques to the problems of matching resources and requirements in multiple computer systems. There has been considerable research in this area over the last decade and intense work continues even as this is being written. For the practitioner, this book serves as a rich source of solution techniques for problems that are routinely encountered in the real world. Algorithms are...
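As a toy illustration of the kind of graph-theoretic matching the book addresses (not an algorithm drawn from it), the following sketch assigns tasks to processors by solving a minimum-cost bipartite matching over a hypothetical cost matrix.

```python
# Toy illustration (not taken from the book): casting task-to-processor
# assignment as a minimum-cost bipartite matching on a hypothetical
# cost matrix, solved with SciPy's Hungarian-style solver.
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i][j] = hypothetical cost of running task i on processor j
cost = np.array([
    [4.0, 2.0, 8.0],
    [4.0, 3.0, 7.0],
    [3.0, 1.0, 6.0],
])

task_idx, proc_idx = linear_sum_assignment(cost)
for t, p in zip(task_idx, proc_idx):
    print(f"task {t} -> processor {p} (cost {cost[t, p]})")
print("total assignment cost:", cost[task_idx, proc_idx].sum())
```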
This book is a revision of my Ph.D. dissertation submitted to Carnegie Mellon University in 1987. It documents the research and results of the compiler technology developed for the Warp machine. Warp is a systolic array built out of custom, high-performance processors, each of which can execute up to 10 million floating-point operations per second (10 MFLOPS). Under the direction of H. T. Kung, the Warp machine matured from an academic, experimental prototype to a commercial product of General Electric. The Warp machine demonstrated that the scalable architecture of high-performance,...
Only two decades ago most electronic circuits were designed with a slide rule, and the designs were verified using breadboard techniques. Simulation tools were a research curiosity and in general were mistrusted by most designers and test engineers. In those days the programs were not user friendly, models were inadequate, and the algorithms were not very robust. The demand for simulation tools has been driven by the increasing complexity of integrated circuits and systems, and it has been aided by the rapid decrease in the cost of computing that has occurred over the past several decades....