We introduce the concept of Domain Theories: models of application domains that make no reference to computing. We illustrate signatures and some axioms of such systems as Railways, Air Traffic, Metropolitan Transport, Sustainable Development, Financial Service Industries, Enterprise Management (Strategic, Tactical and Operational), etc. We relate Domain Theory models to Requirements Capture and to Software Development.
Dines Bjorner is UN Director and Professor of Computing Science at the International Institute for Software Technology, United Nations University. He participated in research into Functional Programming (under John Backus), Relational Databases (under E.F. Codd), and Denotational Semantics and Model-oriented Software Development Techniques (VDM). He was a guest professor at the Univ. of Calif., Berkeley, the Univ. of Copenhagen and the Univ. of Kiel. He was co-founder and Scientific Director of Dansk Datamatik Center. He initiated projects on formal semantics of, and full, portable compilers for, CHILL and ADA, and a project on the RAISE software development method. As Science Advisor for Computer Resources International he initiated the LaCoS project, in which RAISE is being live-tested by 10 European industries. He is founder and chairman (1990--1991) of VDM (now Formal Methods) Europe, a CEC (Commission of the European Communities) group. His research and technology interests include Domain Analysis, Requirements Development, Software Architecture vs. Program Organisation, and Algebraic and Model-oriented software development techniques (VDM and RAISE). He has published more than 60 articles, was (co-)editor of 10 books and proceedings, and (co-)author of 2 books. He is a member of the editorial boards of 11 international journals and a member of more than 50 international conference programme committees, including as chairman of the IFIP World Congress 1986. He is a member of Academia Europaea (The European Academy of Science), the Danish Academy for Technical Sciences, AAAI, ACM, BCS, EATCS, FME (Formal Methods Europe), IEEE CS, IPSJ and SIAM, and a member of IFIP TC2 Working Groups WG2.2 and WG2.3.
The denotational semantics of object-oriented programming languages (OOPLs) is used to derive the control and data dependencies that exist within programs, and this information is then used by a compiler to produce parallel object code. This approach is especially suited to OOPLs because of the concurrency inherent in their semantics. A denotational semantics of an OOPL called SmallC++ is developed with some specialized operations that facilitate the automatic generation of a parallelizing compiler for SmallC++. The result is a compiler which generates code for shared or distributed memory multiprocessors, thereby achieving a language implementation which realizes fully transparent parallel object-oriented programming.
Barrett R. Bryant received his B. S. degree in computer science at the University of Arkansas at Little Rock in 1979, and M.S. and Ph.D. degrees in computer science from Northwestern University in 1980 and 1983, respectively. Since 1983 he has been on the faculty of the University of Alabama at Birmingham where he is currently Associate Professor and Associate Chair in computer science. His research interests are formal semantics of programming languages, compiler design, and object-oriented technology. He is a member of ACM, the IEEE Computer Society, the Association for Computational Linguistics, and the Alabama Academy of Science. He is the Editor-in-Chief of the ACM Special Interest Group on Applied Computing (SIGAPP) Review and an ACM Lecturer.
Prakash Kumar Muthukrishnan received his B. S. in Computer Science and Engineering in Summer 1989 from the College of Engineering, Guindy, Madras, India. He received his M. S. in Computer Science from the University of Alabama at Birmingham (UAB), USA, in Fall 1992. He is currently a Ph.D. candidate at UAB. Mr. Muthukrishnan's research interests are Object-Oriented Programming Languages, Compilers and Distributed Object-Oriented Systems. Currently he is employed with Bear Creek Technologies, Birmingham as a Senior Software Engineer. He is a member of ACM and Phi Beta Delta (Honors Society for International Scholars).
The presentation focuses on a systematic method to ensure the success of process improvement as part of an organisation's strategic plan to get and stay ahead of competitors. The entire program is driven by the business goals: CMM(1) is the roadmap, and ami(2) provides guidelines for selecting actions that address weaknesses and difficulties of the software processes and progressively meet primary goals. Experiences in various organisations and markets will support the presentation, and emphasis will be put on the impact of maturity levels on working environment, communication, technology and quantitative control. A cost-benefit analysis based on real figures will be developed at the end.
(1) CMM: Capability Maturity Model (SEI) (2) ami: application of metrics in industry, a result of an ESPRIT project, published in "a quantitative approach to software management - the ami handbook" (Addison-Wesley, 1995)
Annie Kuntzmann-Combelles is Executive Vice-President of OBJECTIF TECHNOLOGIE, a company she set up in 1989 with the main strategic targets of assessing, mastering and improving the software process. She is heavily involved in software process assessment and improvement based on business goals. She is also a reviewer for the European Union Information Technology programme, chaired the IEEE Advisory Board over 1994 and 1995, was Vice Chair of the COMPSAC 96 conference, and was a member of the 1997 Quality Week Program Committee (San Francisco, May 97) and of the ICAST Conference committee (April 97). Her main interests are SPI (she actively participates in the ISO/SPICE project), measurement, and object-oriented requirements and design. She has given dozens of seminars and training courses on SPI and quantitative software management based on the ami method. She graduated from Ecole Nationale Supérieure de l'Aéronautique et de l'Espace in 1973.
The presentation first introduces agent technology, providing an overview of key concepts, technologies and areas of application. The second part focuses on mobile agents, introducing the major architectural, protocol and programming language components. Relevant standardisation activities, including OMG and FIPA, are described. The last part of the presentation is devoted to application areas of agent technology in the telecommunications domain.
Dr. Stefan Covaci received his Dipl.-Ing. diploma in 1979 from the faculty of Electronics & Telecommunications at the Polytechnical Institute of Bucharest, Romania, and his PhD in Electrical Engineering in 1987. He is a senior researcher at the GMD Research Institute for Open Communication Systems (FOKUS) Berlin, director of the "Intelligent Mobile Agent - Competence Center" and manager of the "Network Management and Global Connectivity Services" group. Presently he is project leader of several projects with Deutsche Telekom / De Te Berkom. Dr. Covaci is head of the German National Laboratory (Xuser) in the pan-European TMN Laboratory, and head of the German National Host in the ACTS-MISA testbed. His research interests include European / Global Information Infrastructure (EII / GII), B-ISDN (ATM, SDH) and Gigabit networking technologies, OSI/TMN integrated network management, ODP and TINA, and Intelligent and Mobile Agents in Telecommunications Management. He is active within the ACTS and EURESCOM research programs and in the standardisation of Intelligent and Mobile Agents (OMG, FIPA). Dr. Covaci is associate professor in the Department of Open Communications Systems (OKS) at the Technical University of Berlin, where he gives lectures and conducts seminars and studies in the areas of B-ISDN/ATM, Lightwave Communications and Open Distributed Telecommunications Management. Dr. Covaci is author of 3 patents and of over 70 papers, and has been a member of the organising and program committees of several international conferences and symposia. He is a member of the New York Academy of Science and of the ACM and IEEE societies.
Over the past 10+ years, object-oriented database systems have come a long way from research prototypes to commercial products to real-life mission-critical applications. Currently, we also witness the extension of relational systems with salient object features, resulting in so-called object-relational DBMS. This talk introduces and reviews the salient features of both approaches, discusses their merits and shortcomings, and for which kinds of applications they are best suited. It also points out where further improvements of the current state of the art are needed. Furthermore, we will speculate about several upcoming areas of database technology in a broad sense (like global information systems, workflow management, component technology) where object-orientation in general and oo/orDBMS in particular might (and should!) take a leading role.
Klaus R. Dittrich is a full professor for informatics at the University of Zurich, Switzerland, where he heads the database technology research group. Prior affiliations include Forschungszentrum Informatik at the University of Karlsruhe, Germany, and the IBM Almaden Research Center, San Jose, USA. He has also been a visiting researcher at Hewlett Packard Laboratories and at Stanford University. His current areas of interest comprise object-oriented and active database systems, DBMS architecture, and the application of advanced database technology, where he is engaged in various projects, partly in cooperation with industry. He has published and lectured extensively all over the world, mainly in the areas of database systems and software engineering. In particular, he has followed the development of object-oriented and extended relational database technology from its very beginning, and has influenced it with his own contributions. He is a consultant for various companies and government agencies in Switzerland, Germany and beyond. For the period 1992-1998, he has been elected to the board of trustees of the VLDB endowment. He was co-chair of VLDB 1995 in Zurich and European PC Chair of VLDB 1997. He is a member of the board of the Swiss Informatics Society and area editor of "Information Systems".
The talks will look at the evolution of theory and practice in interactionally rich systems in recent years. Attention will focus on progress made in developing our understanding of such systems and our ability to model and analyse interactive systems. Topics covered will include the interactor approaches developed at the University of York, UK, and CNR-CNUCE, Italy. The talks will conclude with a discussion of some recent work in which the speaker is engaged, called syndetic modelling, which attempts to combine models of system and user within a common mathematical framework to facilitate reasoning about the conjoint behaviour of user and system.
Prof. David Duce received his BSc degree in Chemistry from the University of Nottingham in 1971 and his PhD for a thesis entitled Infrared and Raman Spectra of Organometallic Compounds in 1975. He joined the Atlas Computer Laboratory of the UK Science and Engineering Research Council in 1974, initially as a systems programmer developing a utilities system for the ICL 1906A mainframe. He then became involved in benchmarking of interactive systems as part of a procurement exercise and started working in computer graphics in 1975. From 1979 to 1984 he was first Technical Secretary and then Academic Coordinator of SERC's Distributed Computing Systems Specially Promoted Programme. He is now a Senior Staff researcher at the UK Engineering and Physical Sciences Research Council's Rutherford Appleton Laboratory. He was one of the editors of the ISO/IEC standard for computer graphics, the Graphical Kernel System (GKS), a revision of which (GKS-94) was published in 1994. He is co-author of books on GKS, PHIGS and Graphics Standards. He is involved in the development of standards for computer graphics, including Presentation Environments for Multi-media Objects (PREMO). He is currently engaged in research projects concerned with the formal specification of graphics and interactive systems and the interface between psychology and interactive systems. He is the co-ordinator of the ERCIM Computer Graphics Network, funded under the CEC Human Capital and Mobility Programme. He is also an honorary professor of the University of East Anglia.
The first part of the talk gives an overview of the history of computer chess, focusing on software and hardware milestones that have arisen from computer chess programs since the early beginnings in the 1960s. In particular, the development of the heart and brain of chess programs will be described: the lookahead algorithm (search) and the static evaluation function (knowledge). Many problems arise from the combination of the two, but until now it is still the most successful approach to computer chess. The second part of the talk will focus on the parallel search algorithm used in the chess program Zugzwang (by Rainer Feldmann and Peter Mysliwietz). The problems of search overhead, communication overhead, and load balancing are discussed. Tradeoffs between each of these efficiency losses will be presented, as well as the solutions that were implemented in the search algorithm of Zugzwang.
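The lookahead algorithm at the heart of such programs is, in its sequential form, alpha-beta search over the game tree, with the static evaluation function applied at the leaves. A minimal sketch (not Zugzwang's parallel version; the tiny game tree and leaf values below are hypothetical):

```python
def alpha_beta(node, depth, alpha, beta, maximizing, children, evaluate):
    """Sequential alpha-beta search: returns the minimax value while
    pruning subtrees that cannot influence the result."""
    kids = children(node)
    if depth == 0 or not kids:
        return evaluate(node)  # static evaluation at the leaves
    if maximizing:
        value = float("-inf")
        for child in kids:
            value = max(value, alpha_beta(child, depth - 1, alpha, beta,
                                          False, children, evaluate))
            alpha = max(alpha, value)
            if alpha >= beta:  # beta cutoff: opponent will avoid this line
                break
        return value
    else:
        value = float("inf")
        for child in kids:
            value = min(value, alpha_beta(child, depth - 1, alpha, beta,
                                          True, children, evaluate))
            beta = min(beta, value)
            if beta <= alpha:  # alpha cutoff
                break
        return value

# Hypothetical two-ply game tree; leaves carry static evaluations.
tree = {"r": ["a", "b"], "a": ["a1", "a2"], "b": ["b1", "b2"]}
leaves = {"a1": 3, "a2": 5, "b1": 2, "b2": 9}
best = alpha_beta("r", 2, float("-inf"), float("inf"), True,
                  lambda n: tree.get(n, []), lambda n: leaves[n])
# best == 3: node "b" is cut off after its first child (2 <= alpha 3).
```

The parallel versions discussed in the talk distribute the subtrees over processors; the search-overhead problem arises because cutoffs found on one processor arrive too late to prune work already started on another.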
Rainer Feldmann studied Computer Science and Mathematics at the University of Paderborn. His 1988 master's thesis was on parallel Alpha-Beta search. In the following years he developed the chess program Zugzwang, which uses an improved version of the parallel Alpha-Beta search. From this work arose, in 1993, a PhD thesis titled "Game Tree Search on Massively Parallel Systems", which won the award of the International Computer Chess Association for the best publication in the field of computer chess in 1993/1994. Rainer Feldmann is now a member of the research group of Prof. Monien at the University of Paderborn.
The talk will begin with an overview of configuration-based programming systems. A major feature of these systems is that they allow a distributed application to be described in terms of the interconnection of heterogeneous software components at the level of their interfaces. As a result, configuration-based programming fosters software re-use, evolution, analysis and management. The talk will then describe the Aster configuration-based system, being developed at INRIA/IRISA, which aims at facilitating the construction of applications with quality-of-service requirements. The approach taken consists of building a distributed runtime system that is customized to the application's needs according to the quality-of-service requirements specified in the application's description.
Valerie Issarny is a research scientist at INRIA. She received her PhD from Universite de Rennes I in 1991. In 1992, she held a postdoctoral position at the University of Washington (Seattle, WA, USA). Since 1993, she has been a member of the Solidor research group at INRIA-IRISA (Rennes, France). She currently leads a project on the design and implementation of a configuration-based programming system. She also participates in projects that investigate the design of distributed operating systems for multimedia applications.
The talk will focus on the relationship between the organizational and technical issues of telecooperation and workflow. After a short introduction it will first give a definition of telecooperation and then describe various scenarios for telecooperation from a usage perspective. The talk will then focus on technologies for telecooperation and specifically describe architectures for the process aspect and the material aspect of telecooperation, thus discussing the connection between telecooperation and workflow systems. Using some telecooperation technologies it will present an outlook on future developments.
Prof. Dr. Helmut A.O. Krcmar has held a Chair for Information Systems at the Institute for Business Administration, University of Hohenheim, Stuttgart, Germany since 1987. He received his MBA degree and his Ph.D. in business administration from the University of Saarbrücken. At the Institute for Information Systems, University of Saarbrücken he worked as researcher and management consultant, as a Post Doctoral Fellow at the IBM Los Angeles Scientific Center, Los Angeles, USA, as Assistant Professor of Information Systems at the Leonard Stern Graduate School of Business, New York University, New York, and as Assistant Professor of Information Systems at Baruch College, City University of New York. His research interests include the areas of Computer Supported Cooperative Work, esp. in meetings and tele-cooperation; information management, esp. the design of new organizational forms; international aspects of business processes and control issues in DP; and decision support for environmental issues in management, esp. life cycle analyses and integrated logistics systems. He has published articles in computer and management journals, including Wirtschaftsinformatik, information management, Journal of Information Systems and Zeitschrift für Betriebswirtschaft. He is an associate editor of MISQ. He was the 1996 Distinguished Visiting Fellow of the Australian Computer Society and Chairman of the European Conference on Information Systems (ECIS), 1996.
In Geographic Information Systems, a terrain or surface in three-dimensional space is often represented by a triangulation. A survey will be given of various algorithms that produce or operate on such terrains. In particular, the computation of a terrain based on the Delaunay triangulation from sample point sets and from gridded data will be discussed. Two algorithms will be treated in more detail. Firstly, the structure of a terrain is represented partly by the contour tree. An efficient algorithm for its construction will be given, along with its application to specific problems. Secondly, compression and decompression for storage and transmission of terrain data will be discussed.
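The Delaunay triangulation mentioned above is characterized by the empty-circumcircle property: no sample point lies inside the circumcircle of any triangle. The core predicate behind the construction algorithms can be sketched as a determinant sign test (the point coordinates below are purely illustrative):

```python
def in_circumcircle(a, b, c, d):
    """True iff point d lies strictly inside the circumcircle of
    triangle (a, b, c), where a, b, c are given in counterclockwise
    order.  A triangulation is Delaunay exactly when this test fails
    for every triangle and every other sample point."""
    ax, ay = a[0] - d[0], a[1] - d[1]
    bx, by = b[0] - d[0], b[1] - d[1]
    cx, cy = c[0] - d[0], c[1] - d[1]
    det = ((ax * ax + ay * ay) * (bx * cy - cx * by)
           - (bx * bx + by * by) * (ax * cy - cx * ay)
           + (cx * cx + cy * cy) * (ax * by - bx * ay))
    return det > 0

# Unit right triangle (counterclockwise); its circumcircle is centered
# at (0.5, 0.5) with radius about 0.707.
inside = in_circumcircle((0, 0), (1, 0), (0, 1), (0.5, 0.5))  # center
outside = in_circumcircle((0, 0), (1, 0), (0, 1), (2, 2))     # far away
```

Incremental Delaunay algorithms of the kind surveyed in the talk flip edges whenever this predicate reports a violation; in practice an exact-arithmetic version of the determinant is used to avoid robustness problems.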
Mark van Kreveld finished his PhD in 1992, on data structures in computational geometry. He held a one-year postdoctoral position at McGill University (1992/1993). Since 1993 he has been a lecturer at Utrecht University. His fields of expertise include computational geometry and geographic information systems. He is co-author of a new textbook on computational geometry (to appear April or May 1997) and was a program committee member of the ACM Symposium on Computational Geometry, 1996.
The efficient implementation of parallel algorithms is often difficult because of the wide gap between the theoretical model the algorithms were designed for and the existing hardware which is to execute the algorithms. A framework is presented to bridge this gap using information explicitly given by the programmer. Experiments with a partial implementation of the system led to several criteria for the classification of parallel algorithms with respect to their efficient realizability. These are also of theoretical interest when searching for more adequate notions of being efficiently parallelizable or being inherently sequential.
Klaus-Joern Lange received both his Ph.D. (1983) and his habilitation (1986) in Hamburg. From 1987 to 1994 he was an associate professor in Munich. Since 1995 he has been a full professor at the University of Tuebingen, chairing the Theory Group at the Computer Science Department.
Given two different (relational) representations (designs) of, say, a database, can we formulate criteria which describe rigorously the various advantages and disadvantages of the two representations? This question, although one of the oldest questions in data modeling, has hardly been answered rigorously. Its first proposed solutions in the context of database theory were formulated when only functional dependencies were considered. Various normal forms for a relation specified by such dependencies were introduced, with Boyce-Codd normal form the best motivated, although not always obtainable while preserving all the functional dependencies originally used in the specification. But the quest for a general definition of a practically motivated normal form in the presence of more general dependencies was abandoned due to various failures. A notable exception is the introduction of ER-normal form derived from entity-relationship design. Activity in design theory deserving of this name has come almost completely to a halt. We shall sketch recent progress in our understanding of this problem and explain its underlying difficulties.
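The classical machinery behind such normal forms rests on computing attribute closures under a set of functional dependencies: a schema is in Boyce-Codd normal form iff the left-hand side of every nontrivial dependency is a superkey. A minimal sketch (the schema and dependencies below are the textbook city/street/zip example, used here purely as an illustration):

```python
def closure(attrs, fds):
    """Closure of an attribute set under functional dependencies,
    given as (lhs, rhs) pairs of frozensets."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return frozenset(result)

def is_bcnf(schema, fds):
    """BCNF test: every nontrivial FD X -> Y must have X as a
    superkey, i.e. the closure of X must cover the whole schema."""
    return all(rhs <= lhs or closure(lhs, fds) == schema
               for lhs, rhs in fds)

# R(city, street, zip) with {C,S} -> {Z} and {Z} -> {C}:
# the classic schema that is in 3NF but not in BCNF.
R = frozenset("CSZ")
fds = [(frozenset("CS"), frozenset("Z")),
       (frozenset("Z"), frozenset("C"))]
bcnf = is_bcnf(R, fds)  # False: Z -> C holds but Z is not a key
```

Decomposing this schema into BCNF necessarily loses the dependency {C,S} -> {Z}, which is exactly the kind of trade-off the early design theory tried, and largely failed, to resolve in general.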
Johann Makowsky has been Associate Professor at the Computer Science Department of the Technion-Israel Institute of Technology since 1984. He received his Ph.D. in Mathematics at the ETH-Zurich in 1974. Prof. Makowsky is currently engaged in research in Mathematical Logic and its applications to various fields in Computer Science, in particular Descriptive Complexity Theory, the Foundations of Database Theory, Logic Programming, Software Engineering and Artificial Intelligence. He has published over 70 contributions in these fields. Work with his students ranges from foundational questions to the design and implementation of intelligent user interfaces.
Solving large problems in combinatorial optimization requires various non-traditional methods. These methods include describing the problem as an LP problem and relaxing it to efficiently solvable linear or convex problems. Different heuristics and approximation algorithms are suggested to obtain suboptimal solutions. Combining various approaches enables us to solve quite large difficult problems. We illustrate this on two problems of practical importance, namely the Travelling Salesman Problem and the Max-Cut Problem.
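The heuristics mentioned above can be as simple as the nearest-neighbour rule for the Travelling Salesman Problem: always visit the closest unvisited city. A minimal sketch on a hypothetical distance matrix (the instance below is made up for illustration):

```python
def nearest_neighbor_tour(dist, start=0):
    """Greedy TSP heuristic: repeatedly move to the closest unvisited
    city.  Fast, but yields a suboptimal tour in general."""
    n = len(dist)
    tour, unvisited = [start], set(range(n)) - {start}
    while unvisited:
        last = tour[-1]
        tour.append(min(unvisited, key=lambda c: dist[last][c]))
        unvisited.remove(tour[-1])
    return tour

def tour_length(dist, tour):
    """Length of the closed tour (returning to the start city)."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

# Hypothetical symmetric 4-city instance.
D = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 8],
     [10, 4, 8, 0]]
tour = nearest_neighbor_tour(D)   # [0, 1, 3, 2]
length = tour_length(D, tour)     # 2 + 4 + 8 + 9 = 23
```

Such suboptimal tours then serve as starting points for the stronger methods the talk covers, e.g. as upper bounds in branch-and-bound schemes driven by LP relaxations.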
Jaroslav Nesetril, professor at Charles University and head of the Department of Applied Mathematics, studied in Prague, Vienna, and at McMaster University. He is the author of more than 200 papers in combinatorics, algebra, complexity theory and graph theory, and of several books (including The Mathematics of Paul Erdos, Springer 1996, together with R.L. Graham). Daniel Turzik received his PhD in mathematics from Charles University, Prague in 1979. He was an assistant at the Department of Mathematics at the University of Chemistry and Technology in Prague, and since 1990 he has been an associate professor at that university. He has had longer research stays at the Mathematical Institute of Oxford University and at the Institut für Mathematik, Technische Universität Graz. His research areas include graph theory, discrete optimization and neural networks.
In this talk we shall firstly cover the major theoretical advances in the area of distributed database systems over the last fifteen years or so. Following this, we shall highlight those developments which have made an impact in the data management industry, and have offered a driving force for many commercial products. We shall conclude with the formulation of those problems which are still open in the area, and examine how their potential solutions can enhance current technology.
Maria Orlowska is currently the Professor of Information Systems at the University of Queensland. She holds MSc (1974) and DSc (1979) degrees in Computer Science from Warsaw University, Poland. Her current research interests are: theory of relational databases; distributed databases; various aspects of information systems design methodologies (including distributed systems); enhancement of semantic data modelling techniques by rigorous factors; transaction processing in distributed systems and concurrency control; distributed and federated database systems; and workflow technology. She has published 109 research papers in international journals and conference proceedings. Since 1992, she has been associated with the Distributed Systems Technology Centre (DSTC) in Brisbane as the Distributed Databases Unit Leader. She has served on the program committees of 44 international conferences, and was elected a Very Large Databases (VLDB) Endowment Trustee in 1994.
Motivated partly by the resurgence of neural computation research, and partly by advances in device technology, there has been a recent increase of interest in analog, continuous-time computation. However, while special-case algorithms and devices are being developed, relatively little work exists on the general theory of continuous-time models of computation. In this lecture, we survey the existing models and results in this area, and point to some of the open research questions.
Pekka Orponen received his PhD in Computer Science at the University of Helsinki in 1986. Since 1996 he has been Professor of Applied Mathematics and Computing at the University of Jyvaskyla. His research visits include the University of California, Santa Barbara, the University of Toronto, the Technical University of Graz, Edinburgh University and Boston University. He has published more than 20 journal articles and 30 conference papers. He is a member of ACM, AMS, EATCS and IEEE.
Design patterns support the development of extensible OO software components. The talk will give an overview of state-of-the-art design pattern approaches, focusing on pattern catalogs and framework patterns. A selection of useful patterns will be discussed in detail. The talk will also discuss the implications of design patterns on the software development process. In order to exploit the potential of design patterns, flexibility requirements have to be captured in the early phases of a software project. A case study illustrates this aspect in the realm of applying design patterns.
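One such framework pattern, Template Method, illustrates how flexibility requirements translate into code: the invariant algorithm skeleton is frozen in a framework class, and the points of planned flexibility become overridable hook methods. A generic sketch (the report-generation classes below are hypothetical, not drawn from any particular catalog):

```python
from abc import ABC, abstractmethod

class ReportGenerator(ABC):
    """Template Method: generate() fixes the overall structure;
    subclasses vary behaviour only through the hook methods."""

    def generate(self, data):
        body = self.format_body(data)        # hook (must override)
        return f"{self.header()}\n{body}"    # invariant skeleton

    def header(self):                        # hook with a default
        return "REPORT"

    @abstractmethod
    def format_body(self, data):
        ...

class CsvReport(ReportGenerator):
    """One concrete variation point: comma-separated body."""
    def format_body(self, data):
        return "\n".join(",".join(map(str, row)) for row in data)

out = CsvReport().generate([[1, 2], [3, 4]])
# out == "REPORT\n1,2\n3,4"
```

The design decision the pattern encodes is exactly the one the talk stresses: which methods are hooks must be decided early, since freezing the wrong part of the skeleton makes later extension expensive.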
Wolfgang Pree received his PhD in computer science from the University of Linz in 1992. He was a research assistant at the University of Zurich in 1987-88, visiting assistant professor at Washington University in St. Louis, U.S.A., in 1992-93, and guest scientist at Siemens' Corporate Research Division in Munich in 1994-96. Currently he is professor of Applied Computer Science at the University of Constance, Germany. His research areas include object-oriented software development, prototyping, human-computer interaction, and visual programming. Wolfgang Pree is the author of "Design Patterns for Object-Oriented Software Development" (Addison-Wesley/ACM Press, 1995) and of "Framework Patterns" (SIGS Books, 1996).
While for countable sets there is a single well established computability theory (ordinary recursion theory), Computable Analysis is still underdeveloped. Several mutually non-equivalent theories have been proposed for it, none of which, however, has been accepted by the majority of mathematicians or computer scientists. In the talk one of these theories, TTE (Type 2 Theory of Effectivity), is presented, which at least in the author's opinion has important advantages over the others. TTE intends to characterize and study exactly those functions, operators etc. known from Analysis which can be realized correctly by digital computers. The talk gives a short introduction to basic concepts of TTE and shows its general applicability by some selected examples. First, Turing computability is generalized from finite to infinite sequences of symbols. Assuming that digital computers can handle (w.l.o.g.) only sequences of symbols, infinite sequences of symbols are used as names for ``infinite objects'' such as real numbers, open sets, compact sets or continuous functions. Such naming systems are called representations. Since only very few representations are of interest in applications, a very fundamental principle for defining effective representations for certain topological spaces is introduced. The concepts are applied to real numbers, open sets, compact sets and continuous functions. The problem of determining zeroes is considered. Computational complexity is discussed and computability in measure theory is sketched. The talk concludes with some remarks on other models for Computable Analysis.
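The idea of names as infinite sequences can be made concrete in a toy form: model a real number as a function that, for any n, returns a rational within 2^-n of it, i.e. a finite prefix of a name, and let operations work on approximations. This is only a sketch in the spirit of TTE, not part of its formal apparatus:

```python
from fractions import Fraction

# A "computable real" is modelled as a function n -> Fraction with
# |x - approx(n)| <= 2**-n: each call reads a finite prefix of a name.

def sqrt2(n):
    """Rational approximation of sqrt(2) within 2**-n, by bisection;
    the invariant lo <= sqrt(2) <= hi bounds the error by hi - lo."""
    lo, hi = Fraction(1), Fraction(2)
    while hi - lo > Fraction(1, 2 ** n):
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo

def add(x, y):
    """Addition is computable on names: precision n of the sum only
    requires precision n + 1 of each argument."""
    return lambda n: x(n + 1) + y(n + 1)

two_sqrt2 = add(sqrt2, sqrt2)
approx = two_sqrt2(20)   # within 2**-20 of 2*sqrt(2) ~ 2.8284271
```

The point of the example is modularity: the adder never needs the whole (infinite) name of its arguments, only finitely many symbols per output precision, which is exactly the continuity property TTE builds on.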
Klaus Weihrauch received his Diplom degree (physics) in 1970, his PhD in computer science with a dissertation on primitive recursive functions in 1973, and his habilitation in theoretical computer science with a thesis on program schemes in 1975. From 1973 he worked as a teaching assistant at Bonn University, in 1974 he visited Cornell University as a research associate, from 1976 he was professor at Technische Hochschule Aachen, and since 1979 he has been full professor at Fernuniversitaet Hagen. He has published three books (Coding Theory, 1973; Computability, 1987; Logic for Computer Science, 1990). His research areas are computability and computational complexity, domain theory and computable analysis, the latter having been his main area during the last 15 years.
An overview of results from machine-oriented complexity theory related to brain-like, or cognitive, computing will be presented. First, a brief survey of complexity results on various kinds of finite neural networks is given. Then four computational models of the brain are presented: Valiant's neuroidal tabula rasa, the memory surface model by Goldschlager, de Bruijn's model of the brain as a molecular computer, and Wiedermann's cogitoid. These four models are compared and their weaknesses as well as advantages are discussed.
Jiri Wiedermann graduated from Comenius University, Bratislava, Slovakia in 1971. In 1981 he received his PhD in Informatics from the Academy of Sciences, Prague. Until 1993 he was with the Institute of Informatics and Statistics, Bratislava, latterly as head of the Department of Applied Informatics. He also gave lectures at Comenius University. In 1994 he became vice-director of the Institute of Computer Science of the Academy of Sciences of the Czech Republic, Prague. He has published especially in algorithms, data structures, machine-oriented complexity theory and neurocomputing. He also lectures at Charles University, Prague.
One important thread within advanced database systems research is the notion of rule-based database systems. The power of definite rules coupled with relational technology has led to the emergence of deductive databases. However, while this type of database system provides more advanced functionality, it suffers from other limitations of relational database systems such as the lack of data structures. The realisation that the object-oriented approach is complementary to the deductive one and that the two can be combined to produce deductive object-oriented databases with all the benefits of both represents an important breakthrough for rule-based database systems. An alternative to the deductive rule approach is the active rule approach. Active rules are more powerful than deductive rules but lack the advantages of the sound theoretical foundation of the latter. The two ideas can be combined to produce an active DOOD provided that the integration is treated with care.
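The definite rules at the core of deductive databases are evaluated bottom-up to a fixpoint; the textbook example is the transitive closure of an edge relation, which plain relational algebra cannot express. A minimal naive-evaluation sketch (the relation names and facts below are hypothetical):

```python
def naive_fixpoint(edges):
    """Bottom-up evaluation of the Datalog program
         path(X, Y) :- edge(X, Y).
         path(X, Z) :- path(X, Y), edge(Y, Z).
    The rules are applied until no new facts can be derived; the
    result is the least fixpoint, i.e. the transitive closure."""
    path = set(edges)
    while True:
        derived = {(x, z)
                   for (x, y) in path
                   for (y2, z) in edges
                   if y == y2}
        if derived <= path:
            return path
        path |= derived

facts = {("a", "b"), ("b", "c"), ("c", "d")}
paths = naive_fixpoint(facts)
# Adds ("a", "c"), ("b", "d") and ("a", "d") to the three base facts.
```

Semi-naive evaluation improves on this by joining only the newly derived tuples in each round; active rules, by contrast, attach triggering events and conditions to such derivations rather than iterating them to a declarative fixpoint.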
Prof. Howard Williams obtained his PhD degree in ionospheric physics at Rhodes University in South Africa. He started lecturing in Computer Science in 1970 and was appointed as Professor of Computer Science in 1977. In 1980 he took up the post of Professor of Computer Science at Heriot-Watt University in Edinburgh. From 1980 to 1988 he was Head of Department of Computer Science. During this period he built up a research group in databases and knowledge based systems. Since 1988 he has led this group, participating in a number of EU and locally funded projects. In 1993 he was awarded a DSc degree in Computer Science. One of his major research interests over the past 15 years has been rule-based database systems. This began with Prolog databases and deductive database systems, and more recently has extended to deductive object-oriented databases and active database systems.
Computational complexity studies the intrinsic difficulty of solving mathematically posed problems. Information-based complexity is a branch of computational complexity that deals with continuous problems defined typically on spaces of multivariate functions. For such problems only approximate solutions can be computed. We are interested in the complexity of linear multivariate problems in various settings. In particular, the complexity depends on the error parameter e, and on the number d of variables. Many problems in science, engineering, economics and finance are modelled by multivariate problems involving functions of d variables with large d. In this talk, we survey recent results on the complexity of linear multivariate problems. In particular, we show that multivariate integration and approximation are strongly tractable in the average case setting for the class of continuous functions equipped with the Wiener sheet measure.
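The role of the dimension d can be felt in a plain Monte Carlo sketch for multivariate integration, whose error decays like n^{-1/2} regardless of d; this is an illustrative toy, not one of the average-case optimal algorithms the talk refers to (the test integrand is a made-up product function with known integral 1):

```python
import math
import random

def mc_integrate(f, d, n, seed=0):
    """Estimate the integral of f over the unit cube [0,1]^d by
    averaging n uniform random samples.  The standard error behaves
    like n**-0.5 independently of the dimension d."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = [rng.random() for _ in range(d)]
        total += f(x)
    return total / n

# Integrand prod(2*x_i) over [0,1]^10 has integral exactly 1.
d = 10
est = mc_integrate(lambda x: math.prod(2 * xi for xi in x), d, 50000)
```

A product grid with only 10 points per axis would already need 10^10 evaluations in this dimension; the tractability results in the talk concern how such exponential dependence on d can be avoided with deterministic guarantees in the average case setting.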
Henryk Wozniakowski is Full Professor in Mathematics and Computer Science at the University of Warsaw. Since 1984 he has also been Professor at Columbia University, Department of Computer Science, and since 1992 an AT&T consultant. His visiting positions include the University of New South Wales, the Australian National University, Carnegie-Mellon University and Moscow State University. He is co-author of 4 books, the most recent being "Optimal Recovery" with B. Bojanov (1992). He has published more than 90 papers and is an editorial board member of the J. of Complexity, Matematyka Stosowana (PL), Zastosowania Matematyki (PL), J. of Numerical Linear Algebra with Applications, and Numerical Algorithms. Since 1997 he has been a member of the Executive Committee of FoCM (Foundations of Computational Mathematics).
We present a model for theoretical studies of ATM networks. Within this model we consider the problem of virtual path layout, which amounts to covering the network with simple paths under various constraints. The constraints are the hop count (the number of paths traversed between the nodes that have to communicate), the load (the number of paths that share an edge or a node), and the stretch factor (the total length traversed between pairs of nodes, compared with the distance between them). The talk describes several results for such layouts; the techniques used include recursive constructions, greedy algorithms, lower bound proofs, NP-completeness results, and dynamic programming. The talk is self-contained; in particular, it requires no a priori knowledge of ATM networks.
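The load and hop count measures can be computed directly from a layout. The following sketch, assuming virtual paths are given as node sequences over an undirected network, is illustrative only (the function names `edge_load` and `hop_count` are ours, not the talk's); the stretch factor would additionally require shortest-path distances.

```python
from collections import Counter

def edge_load(layout):
    """Maximum number of virtual paths sharing any single physical edge."""
    counts = Counter()
    for path in layout:
        for u, v in zip(path, path[1:]):
            counts[frozenset((u, v))] += 1   # undirected edge
    return max(counts.values())

def hop_count(route):
    """Number of virtual paths traversed along a route, given as a
    list of consecutive virtual paths."""
    return len(route)

# Chain network 0-1-2-3 with virtual paths (0,1,2), (2,3) and (1,2):
# the physical edge {1,2} is shared by two virtual paths.
layout = [(0, 1, 2), (2, 3), (1, 2)]
print(edge_load(layout))                 # 2
print(hop_count([(0, 1, 2), (2, 3)]))    # route from 0 to 3: 2 hops
```

The trade-off studied in the talk is between keeping such loads small and keeping hop counts small for all communicating pairs.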
Shmuel Zaks received his B.Sc. (1971) and M.Sc. (1972) in Mathematics from the Technion, Israel, and his Ph.D. (1979) in Computer Science from the University of Illinois at Urbana-Champaign. He is with the Department of Computer Science at the Technion. His main research area is Distributed Computing; other areas include Communication Networks (especially ATM-related models) and Combinatorial and Graph Algorithms. He is on the editorial board of Information Processing Letters.
An overview of VRML, the widely used description format for virtual reality on the Web, is presented. Basic principles as well as useful tricks for creating dynamic virtual worlds are illustrated by a number of examples. Competing languages for VR (QuickTimeVR, Active VRML) are compared with VRML. Possible future extensions and expectations are discussed.
Jiri Zara is an associate professor at the Department of Computer Science & Engineering at CTU Prague. He lectures in several courses on computer graphics (Raster Graphics Algorithms, Visualization, Graphics Systems on SGI) and supervises the Computer Graphics Laboratory at the same department. He is a member of IEEE and ACM, and a representative of the Czech Republic in ISO/IEC JTC1 - SC24 (Comp. Graphics) and SC29 (Multimedia). His research interests cover efficient algorithms for ray tracing, medical visualization and virtual environments.
The information revolution is radically transforming many of the patterns along which society and enterprises have traditionally worked. These changes do not bring just minor technological improvements, but a fundamental transformation of the industry-based society into an information-based one. They are most visible and best documented within the business world, but the synergy between technological and social shifts does not stop there. In this paper we try to identify and summarize the key trends and challenges that this development puts before us.
Jiri Zlatuska received his PhD in Computer Science from the Czechoslovak Academy of Sciences in 1987. He was a researcher at the Institute of Computer Science of Masaryk University in 1981--87, a visiting Associate Professor at the University of Delaware in 1988--89, a visiting lecturer for PhD studies in Pisa in 1989, and research director of the Institute of Computer Science in 1989--94; he is currently Professor of Computer Science and Dean of the Faculty of Informatics of Masaryk University in Brno. In 1993 he was a research fellow at City University in London within the Copernicus programme, and in 1996 an Eisenhower Exchange Fellow studying social implications of computing in the U.S. His research areas include the lambda-calculus and its applications, logic programming, and digital typesetting and publishing.