A Paradigm for Decentralized Process Modeling presents a novel approach to decentralized process modeling that combines both trends and suggests a paradigm for decentralized PCEs, supporting concerted efforts among geographically dispersed teams - each local individual or team with its own autonomous process - with emphasis on flexible control over the degree of collaboration versus autonomy provided. A key guideline in this approach is to supply abstraction mechanisms whereby pre-existing processes (or workflows) can be encapsulated and retain security of...
Cooperating Heterogeneous Systems provides an in-depth introduction to the issues and techniques surrounding the integration and control of diverse and independent software components. Organizations increasingly rely upon diverse computer systems to perform a variety of knowledge-based tasks. This presents technical issues of interoperability and integration, as well as philosophical issues of how cooperation and interaction between computational entities is to be realized. Cooperating systems are systems that work together towards a common end. The concepts of cooperation must be...
In the last few years, the world of information networks has undergone significant changes that will revolutionize the future of communications. Data rates have reached the gigabit per second range. Optical fibers have become the transmission medium of choice. Standardization activities have very aggressively produced a set of well-established standards for future LANs, MANs and WANs. It has become very difficult for computer and communications professionals to follow these rapidly evolving technologies and standards. High Performance Networks: Technology and Protocols provides...
This is a milestone in machine-assisted microprocessor verification. Gordon [20] and Hunt [32] led the way with their verifications of simple designs; Cohn [12, 13] followed this with the verification of parts of the VIPER microprocessor. This work illustrates how much these, and other, pioneers achieved in developing tractable models, scalable tools, and a robust methodology. A condensed review of previous research, emphasising the behavioural model underlying this style of verification, is followed by a careful, and remarkably readable, account of the SECD architecture, its formalisation,...
Machine Learning is one of the oldest and most intriguing areas of Artificial Intelligence. From the moment that computer visionaries first began to conceive the potential for general-purpose symbolic computation, the concept of a machine that could learn by itself has been an ever present goal. Today, although there have been many implemented computer programs that can be said to learn, we are still far from achieving the lofty visions of self-organizing automata that spring to mind when we think of machine learning. We have established some base camps and scaled some of the foothills of...
What follows is a sampler of work in knowledge acquisition. It comprises three technical papers and six guest editorials. The technical papers give an in-depth look at some of the important issues and current approaches in knowledge acquisition. The editorials were produced by authors who were basically invited to sound off. I've tried to group and order the contributions somewhat coherently. The following annotations emphasize the connections among the separate pieces. Buchanan's editorial starts on the theme of "Can machine learning offer anything to expert systems?" He emphasizes the...
One of the currently most active research areas within Artificial Intelligence is the field of Machine Learning, which involves the study and development of computational models of learning processes. A major goal of research in this field is to build computers capable of improving their performance with practice and of acquiring knowledge on their own. The intent of this book is to provide a snapshot of this field through a broad, representative set of easily assimilated short papers. As such, this book is intended to complement the two volumes of Machine Learning: An Artificial Intelligence...
This volume contains a selection of papers that focus on the state-of-the-art in formal specification and verification of real-time computing systems. Preliminary versions of these papers were presented at a workshop on the foundations of real-time computing sponsored by the Office of Naval Research in October, 1990 in Washington, D.C. A companion volume by the title Foundations of Real-Time Computing: Scheduling and Resource Management complements this book by addressing many of the recently devised techniques and approaches for scheduling tasks and managing resources in real-time systems....
One of the most intriguing questions about the new computer technology that has appeared over the past few decades is whether we humans will ever be able to make computers learn. As is painfully obvious to even the most casual computer user, most current computers do not. Yet if we could devise learning techniques that enable computers to routinely improve their performance through experience, the impact would be enormous. The result would be an explosion of new computer applications that would suddenly become economically feasible (e.g., personalized computer assistants that automatically...
The history of this book begins way back in 1982. At that time a research proposal was filed with the Dutch Foundation for Fundamental Research on Matter concerning research to model defects in the layer structure of integrated circuits. It was projected that the results might be useful for yield estimates, fault statistics and for the design of fault-tolerant structures. The reviewers were not in favor of this proposal and it disappeared in the drawers. Shortly afterwards some microelectronics industries realized that their survival might depend on a better integration between technology-and...