African Journals Online
South African Computer Journal / Die Suid-Afrikaanse Rekenaartydskrif
Issue 26, November 2000
Abstracts
A comparison of bisimulation-based semantic equivalences
for noninterleaving behaviour over CCS processes
Galpin, V.
Abstract: A number of extensions to the process algebra
CCS (Calculus of Communicating Systems) have been proposed to
deal with noninterleaving behaviour such as location and
causality. The aim of the paper is to use existing and new
comparison results to provide a hierarchy of these semantic
equivalences over pure finite CCS terms. It is not possible to
include some extensions in this hierarchy and the reasons for the
exclusion are given.
Algebraic results for structured operational semantics
Galpin, V.
Abstract: This paper presents algebraic results that
are important for the extended tyft/tyxt format [12, 13] which
can be used to describe many different process algebras. This
format is based on a many-sorted signature which permits both
processes and labels to be treated syntactically. Existing
results for this format permit the comparison of process algebra
semantic equivalences by forming the sum of two transition system
specifications and imposing certain conditions. The results
presented in this paper involve the summing of congruences that
model the actual process algebra labels, and determine under what
conditions these congruences have important properties such as
compatibility and conservativity. The aim of this paper is to
show that the notion of sort-similarity on the sum of signatures
is sufficient for the sum of the congruences induced by each
label algebra to be the same as the congruence induced by the
summed label algebras. Additionally, sort-similarity is
sufficient for compatibility and conservativity when summing.
Finally, conditions on the label algebra are given that ensure
compatibility.
Algorithms for the shortest exact common superstring
problem
Elloumi, M.
Abstract: The Shortest Exact Common Superstring
(SECS) problem is: given a set of strings f = {w1, w2, ..., wn},
where no wi is an exact substring of wj, i ≠ j, find a shortest
string Se such that every string of f is an exact substring of
Se. When the number of strings n > 2, the SECS problem
becomes NP-complete. In this paper, we present an exact SECS
algorithm and a greedy approximation one. Our greedy algorithm is
a 1/2-approximation for the SECS problem. Our exact SECS
algorithm and our greedy one are, respectively, of complexities
O(n^n) and O(n^2 · (l + log(n))) in computing time, where n is
the number of the strings and l is the maximum length of a
string. Our SECS algorithms are based on the computation of the
Length of the Exact Longest Overlap (LELO).
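The greedy approach can be sketched as a generic overlap-merge procedure; the paper's own LELO-based algorithm and data structures are not reproduced here, and `overlap` below is a naive quadratic computation of the longest exact overlap:

```python
def overlap(a, b):
    """Length of the longest suffix of a that is a prefix of b."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return k
    return 0

def greedy_superstring(strings):
    """Greedy merge: repeatedly join the pair with the largest exact overlap
    until a single superstring remains."""
    strings = list(strings)
    while len(strings) > 1:
        best = (-1, None, None)
        for i, a in enumerate(strings):
            for j, b in enumerate(strings):
                if i != j:
                    k = overlap(a, b)
                    if k > best[0]:
                        best = (k, i, j)
        k, i, j = best
        merged = strings[i] + strings[j][k:]
        strings = [s for n, s in enumerate(strings) if n not in (i, j)]
        strings.append(merged)
    return strings[0]
```

For example, merging {"abc", "bcd", "cde"} yields "abcde", in which every input occurs as an exact substring.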
An expert system algorithm for the identification problem
De Kock, G. de V.
Abstract: An algorithm is proposed for an expert system
to solve the general identification problem which is formally
defined. The algorithm follows a statistical approach and is
based on the work of the author in [2] and Vogts in [11]. A
measure is defined to express the similarity between an observed
specimen to be identified and any one of the possible classes. It
is based on a similarity function of the common property values.
This approach solves the inaccuracies of the linearized approach
given in [2]. Using a measure of clustering, a method to select
the `best' next property to be established during the execution
of the algorithm is proposed. A prototype of the algorithm is
tested on a small knowledge base to identify South African cycads,
similar to that given in [2]. The comparison with the results in
[2] shows that the proposed algorithm is a significant
improvement.
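As an illustration only, a similarity between an observed specimen and a candidate class might be sketched as a weighted agreement over shared property values; the property names and weighting scheme below are invented for this sketch, and the paper's actual statistical measure is not reproduced:

```python
def similarity(specimen, klass, weights):
    """Hedged sketch of a similarity measure: weighted fraction of the
    properties, common to specimen and class, on which their values agree."""
    common = set(specimen) & set(klass)
    if not common:
        return 0.0
    total = sum(weights.get(p, 1.0) for p in common)
    agree = sum(weights.get(p, 1.0) for p in common if specimen[p] == klass[p])
    return agree / total
```

A class scoring highest against the specimen would then be the proposed identification, and the "best next property" heuristic would pick the property whose answer most separates the remaining candidate classes.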
A formal perspective to specification of transaction
systems
Ehikioya, S.A.; Barker, K.
Abstract: This paper describes the relationship and
relevance of predicate logic in the specification of transaction
system protocols. We present a formal specification technique
based on concurrent execution that provides automatic
communication and synchronization mechanisms between concurrent
processes. This framework is not biased towards specific types of
transactions. It integrates temporal behaviour of individual
transactions with the dependencies that can arise when accessing
shareable data. Our approach allows the specification of
transaction constraints and transaction dependencies in a
declarative fashion. This provides flexibility in their
implementation and permits different techniques to enforce the
constraints independent of the application program. This paper
explores the issues of correctness, reliability, and recovery of
data distributed across the enterprise; describes techniques for
guaranteeing and enhancing correctness; and procedures for
recovering transactions and data in case of any transaction
failures to guarantee a high degree of system dependability.
A multiple perspectives evaluation of the factors affecting
software development productivity in an outsourced IT project
Petkova, O.; Roode, J.D.; Petkov, D.
Abstract: The paper provides a model for examining the
perceptions of the users and the outsourcing providers about what
influences software development productivity. It is illustrated
through a case study. The model uses a multicriteria decision
analysis approach - the Analytic Hierarchy Process. This type of
analysis of perceptions of the different stakeholders in an
outsourcing relationship can be applied to other organisational
environments to help improve the management of outsourced IT
activities.
An incremental construction algorithm for Venn diagrams
Eloff, J.; van Zijl, L.
Abstract: The drawing of Venn diagrams has
traditionally served as an aid to analyzing the characteristics
exhibited by the diagram and its underlying Venn graph. Little
attention has hitherto been given to the presentation,
construction and way in which a Venn diagram conveys information.
We present a new algorithm for the construction and layout of a
Venn diagram based on an incremental technique. The algorithm
constructs the Venn diagram by incrementally deriving new
intersections from previously placed ones on a grid. We also
propose criteria which can be applied to analyze a Venn diagram's
drawing and layout. Our algorithm aims to generate an
aesthetically pleasing layout by satisfying the proposed
criteria.
An XML based approach to enforcing history-based separation
of duty policies in heterogeneous workflow environments
Papenfus, C.; Botha, R.
Abstract: In the computing world a new technology
occasionally comes along, promising to make dramatic changes to
the way computing tasks are performed. The Extensible Markup
Language (XML) has been heralded as one such technology. XML
promises to provide a universal metadata mechanism for defining,
understanding and interchanging information between possibly
heterogeneous systems. This paper exploits this powerful promise
of XML by examining how it can be used to enforce history-based
separation of duty policies across heterogeneous workflow
environments. A very brief overview of separation of duty
policies is provided, whereafter the need for history-based
separation of duty is motivated through an extensive case study.
A solution based on XML baggage is proposed and it is shown how
the solution would operate in the context of the case study.
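By way of illustration, history-carrying "XML baggage" might look something like the fragment below; the element and attribute names are invented for this sketch and are not taken from the paper:

```xml
<!-- Hypothetical XML baggage travelling with a workflow item: the task
     history lets any participating system check history-based
     separation of duty before assigning the next task. -->
<baggage workflowId="PO-1042">
  <history>
    <task name="prepare_payment" performedBy="alice" role="clerk"/>
    <task name="verify_invoice"  performedBy="bob"   role="auditor"/>
  </history>
</baggage>
```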
A proposed framework for evaluating generic enterprise
models
Van Belle, J-P.; Price, B.
Abstract: This paper investigates a number of ways in
which enterprise models can be compared and proposes a framework
for assessing the quality of generic enterprise models. The paper
starts with an overview of various qualitative and quantitative
yardsticks, taken from related disciplines, which can be used to
evaluate generic enterprise models. It then uses some of these to
develop a suitable framework for assessing or comparing the
quality of enterprise models. The resultant framework is
essentially an extension of Bart-Jan Hommes' framework for
analysing the quality of a business modelling technique, to
include some criteria suggested for evaluating the genericity of
enterprise reference architectures. To illustrate how the
proposed framework can be operationalised, it is then applied to
two well-known general enterprise models: the
reference models underlying two ERP solutions, namely those of
Baan and SAP AG.
Computerised assessment of the appearance of abraded fabric
Fazekas, Z.; Szalai, L.
Abstract: As a continuation of our earlier research on
automatic visual measurement methods of fabric quality features,
here we present an automatic visual approach towards measuring
another fabric quality feature, the feature being the durability
of the fabric's visual appearance (DVA). This is a novel fabric
quality feature, though a closely related quality feature is
sometimes used for certain types of fabrics. DVA complements the
presently frequently used fabric quality feature - namely abrasion
resistance - without imposing prohibitive extra procedures into
the standardised fabric assessment. The DVA describes the way the
fabric's appearance changes during its mechanical lifetime.
Effectively, it furnishes information on the portion of the
fabric's mechanical lifetime for which the fabric remains
visually (aesthetically) stable. In our paper we explain why the
DVA is a useful fabric quality feature for the various players in
the fabric trade and show how the measurement of a fabric's DVA
relates to the measurement of abrasion resistance. We also
present a possible implementation of the suggested assessment
approach. The DVA of a fabric sample is calculated from images
taken at different stages of abrasion testing. In the presented
implementation certain brightness, colour and morphological
features are calculated for each of these images and the acquired
feature values are used to calculate the fabric's DVA.
Cooperative learning in neural networks using particle swarm
optimisers
Van den Bergh, F.; Engelbrecht, A.P.
Abstract: This paper presents a method to employ
particle swarm optimizers in a cooperative configuration. This
is achieved by splitting the input vector into several
sub-vectors, each of which is optimized cooperatively in its own
swarm. The application of this technique to neural network
training is investigated, with promising results.
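A minimal sketch of the cooperative scheme follows, assuming a round-robin split of the vector and standard inertia-weight velocity updates; the paper's exact update rules, split strategy and parameters are not reproduced:

```python
import random

def cpso_minimise(f, dim, n_swarms=2, swarm_size=10, iters=60):
    """Cooperative PSO sketch: the input vector is split into sub-vectors,
    each optimised by its own swarm.  A particle is evaluated by plugging
    its sub-vector into a shared 'context' vector assembled from the
    swarms' current bests."""
    lo, hi = -5.0, 5.0
    parts = [list(range(i, dim, n_swarms)) for i in range(n_swarms)]
    context = [random.uniform(lo, hi) for _ in range(dim)]

    def eval_sub(idx, sub):
        trial = context[:]
        for d, v in zip(idx, sub):
            trial[d] = v
        return f(trial)

    swarms = []
    for idx in parts:
        pos = [[random.uniform(lo, hi) for _ in idx] for _ in range(swarm_size)]
        swarms.append({"idx": idx, "pos": pos,
                       "vel": [[0.0] * len(idx) for _ in range(swarm_size)],
                       "pbest": [p[:] for p in pos]})

    for _ in range(iters):
        for s in swarms:
            idx = s["idx"]
            gbest = min(s["pbest"], key=lambda p: eval_sub(idx, p))
            for i in range(swarm_size):
                for d in range(len(idx)):
                    r1, r2 = random.random(), random.random()
                    s["vel"][i][d] = (0.72 * s["vel"][i][d]
                                      + 1.49 * r1 * (s["pbest"][i][d] - s["pos"][i][d])
                                      + 1.49 * r2 * (gbest[d] - s["pos"][i][d]))
                    s["pos"][i][d] += s["vel"][i][d]
                if eval_sub(idx, s["pos"][i]) < eval_sub(idx, s["pbest"][i]):
                    s["pbest"][i] = s["pos"][i][:]
            best = min(s["pbest"], key=lambda p: eval_sub(idx, p))
            for d, v in zip(idx, best):   # publish this swarm's best
                context[d] = v
    return context, f(context)
```

For neural network training, f would be the network error and the vector the concatenated weights, with each sub-swarm responsible for one slice of the weight vector.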
Diversity and effectiveness in information systems project
teams
Smith, D.C.; Van Jaarsveld, Q.; Neethling, C.; Andrews, R.
Abstract: This study investigated IS team effectiveness
in relation to team diversity. Team diversity in the context of
this research was based on Belbin's Team Role Theory. The team
role diversity of 12 student project teams was evaluated using two
commonly used Belbin questionnaires. A third questionnaire, based
on the Francis & Young model, was used to evaluate overall
team performance levels. An analysis of the results identified a
very strong relationship between team role diversity and team
effectiveness in IS project teams. Thus, the greater the role
diversity of the team, the better the team performance. These
conclusions support the findings of other more general research
studies and can assist those selecting IS project teams who want
to improve overall team effectiveness.
Evaluating the BPR effect of a SAP R/3 implementation in a
manufacturing environment
Calitz, A.P.; Calitz, M.B.
Abstract: The introduction of ERP software in
organisations changes business processes and procedures. ERP
implementations have therefore been compared to BPR projects,
which focus on streamlining business processes. Critical success
factors have been identified for both ERP and BPR
implementations. In this study it is shown that the
implementation of ERP software has a business process
reengineering effect on the business and that the business should
therefore also focus on BPR success factors. Various attributes
are associated with reengineered processes, such as faster
throughput, empowerment of employees and elimination of
unnecessary tasks. The implementation of an ERP package, SAP R/3,
at a textile manufacturing company was effectively a
reengineering project as is proven in this study, where the
resulting processes are measured against attributes of
reengineered processes.
Feedback in human-computer interaction:
characteristics and recommendations
Renaud, K.; Cooper, R.
Abstract: The need for feedback is widely accepted, and
the role of feedback in making things easier for the end-user is
also no longer disputed. However, the nature of the feedback and
the manner of this provision is by no means agreed upon. Feedback
is traditionally considered to be a communication of immediate
system state to the end-user. This paper will argue the need for
extending the concept to encompass the provision of archival
feedback information about past activity. We also argue
the need for graphical, rather than textual, feedback and provide
a list of desirable feedback features which should be provided by
an application. The paper ends off by examining the means for
this provision.
Investigating the use of multiresolution image
processing and artificial neural networks for computer-aided
diagnosis within a telemedicine network
Aydn Alaylolu, B.; Aghdasi, F.
Abstract: In image processing and pattern
recognition, the usefulness of extracted image features and
classifiers are assessed according to their accuracy in
classifying new data, such as assigning a diagnostic decision to
digitised medical images. In such applications, predictive
classifiers are a valuable tool only if they increase the number
of true detections while minimising the occurrence of false
predictions. A computer-automated detection scheme, making use of
an artificial neural network classifier, with input feature
vectors containing spatial, spectral and multiscale image
attributes, is investigated here with an objective to improve the
efficiency of the medical diagnostic processes. A wavelet-based
image enhancement technique is also implemented, with the aim of
improving the computer-automated detection performance. The
incorporation of the CAD workstations into a telemedicine service
network and biomedical informatics systems to meet the challenges
of the new millennium is described.
Knowledge management technology: examination of information
diverse repositories
Handzic, M.; Parkin, P.
Abstract: This paper reports results of an empirical
examination of the contribution of information diverse
repositories in enhancing individual knowledge and performance in
a judgemental decision making context. A laboratory experiment
was conducted using 32 graduate students as voluntary subjects.
Performance of actual subjects was compared with that of their
nominal naive and optimal counterparts. Results indicate that
actual subjects performed better than naive, but worse than
optimal nominals, irrespective of the level of information
diversity present in their available repositories. The results
also indicate that subjects tended to perform significantly worse
when faced with more diverse information. It can be concluded
from the results that, in general, knowledge workers may have
difficulties in turning available task information into
knowledge and translating it to task performance. This also
suggests that such workers may potentially benefit from a number
of other knowledge management initiatives that would enhance
their understanding of the existence and the form of
relationships among diverse information.
Motivating and recruiting intending IS professionals: a
study of what attracts IS students to prospective employment
Turner, R.; Lowry, G.
Abstract: This paper is the second study in a research
program aimed at achieving a better fit between university
courses and professional practice of information systems. The
paper reports the results of a survey of student attitudes
towards incentives and conditions of employment that may be
useful to employers who wish to attract and retain scarce and
talented information technology professionals. Students were
asked to rate the importance of eleven conditions and incentives
and to identify and rank the five incentive variables that they
considered most important to them in their decision to accept and
remain in work in a given organisation. The findings suggest that
IS students are most attracted by a friendly work environment,
supportive superiors, and perceived promotional opportunities,
with economic incentives such as salary and fringe benefits rated
as rather less important. Implications for employers who are
dealing with an ongoing global shortage of information technology
staff are discussed.
On the validation and legitimisation of an information
system: some theoretical considerations and a case study
Petkova, O.; Petkov, D.
Abstract: This paper provides a discussion on the
validation and legitimisation of information systems in general.
The theory is illustrated on the case of a project about the
computerisation of the management of the research function at a
university. The paper aims to show that traditional aspects of
model validity and legitimisation in Operational Research can be
applicable to the field of Information Systems (IS). However, the
validation considerations in this case sometimes need
considerable modification since a more appropriate way of viewing
complex IS projects is through an interpretivist viewpoint. Some
extensions on the notion of validation for soft systems are
provided for that purpose. The importance of both validation and
legitimisation in Information Systems is illustrated on a case
study about a failed software project.
Placing axial lines in urban grids
Sanders, I.D.
Abstract: In earlier papers the orthogonal axial line
placement problem for orthogonal rectangles and the
non-orthogonal axial line placement problem in orthogonal
rectangles were presented. These problems were shown to be
NP-Complete by transformations from the vertex cover problem for
planar graphs. The general axial line placement problem -
placing axial lines to cross the adjacencies between adjacent
convex polygons - is a more general case of the problem of
placing non-orthogonal axial lines in orthogonal rectangles and
the NP-Completeness proof can be extended to this problem as
well. In this paper the axial line placement problem and the
related problem of generating the convex map for polygons with
holes which can be represented as urban grids are considered. The
paper shows that for these arrangements of polygons the solutions
can be found in polynomial time. It is also conjectured that some
more general configurations could also be solved in polynomial
time.
Scheduling on a multiprocessor-based telephone switch
Liu, M.; Majumdar, S.; Streibel, D.; Carroll, B.
Abstract: This research is concerned with reducing memory
contention through process scheduling in the call control module
of a scalable shared memory multiprocessor-based telephone
switch. Multiple processes may contend for shared variables
stored in the shared memory of the multiprocessor system. A
lock-rollback mechanism is used on the system to serialize access
to shared data and preserve data integrity. A process locks a
memory line before accessing it. If the line is already locked,
either the requesting process or the process currently holding
the memory line is rolled back. The rolled back process restarts
from the beginning at a future point in time. Rollbacks expend
work to preserve data integrity. Effective control of memory
contention is thus required for achieving high system capacity
and scalability. Based on trace-driven simulation we have
investigated the performance of three different scheduling
approaches. These include two pure strategies and a hybrid
strategy that combines the pure strategies into a single
scheduling approach. The results of the research demonstrate that
the strategy used for scheduling can have a significant impact on
performance. The strengths of the pure strategies are described
and the tunability of a hybrid strategy for handling a wide range
of workloads corresponding to a variety of different products is
discussed.
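The lock-rollback mechanism can be illustrated with a toy round-robin interleaving; the switch's actual scheduler, memory-line granularity and rollback-victim policy are not reproduced, and this sketch always rolls back the requesting process:

```python
from collections import deque

def simulate(plans):
    """Toy lock-rollback simulation.  Each plan is the ordered list of memory
    lines a process must lock.  Processes take turns locking one line each;
    on a conflict the requester releases its locks and restarts later."""
    owners = {}                                   # memory line -> holder pid
    queue = deque((pid, 0, []) for pid in range(len(plans)))
    rollbacks = 0
    order = []                                    # completion order
    while queue:
        pid, step, held = queue.popleft()
        line = plans[pid][step]
        holder = owners.get(line)
        if holder is not None and holder != pid:  # conflict: roll back requester
            for l in held:
                del owners[l]
            rollbacks += 1
            queue.append((pid, 0, []))            # restart from the beginning
            continue
        owners[line] = pid
        held = held + [line]
        if step + 1 == len(plans[pid]):           # finished: commit and release
            for l in held:
                del owners[l]
            order.append(pid)
        else:
            queue.append((pid, step + 1, held))
    return rollbacks, order
```

With plans [["a", "b"], ["b", "a"]] the classic lock-order conflict appears: process 0 is rolled back once and process 1 completes first.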
Small group collaboration and presence in a virtual
environment
Casanueva, J.; Blake, E.
Abstract: Presence in Collaborative Virtual
Environments (CVEs) can be classified into personal presence and
co-presence. Personal presence is having a feeling of being
there in the CVE oneself. Co-presence is having a feeling
that one is in the same place as the other participants, and that
one is collaborating with real people. In this paper we describe
an experiment to investigate the effects that small group
collaboration and interaction has on personal presence and
especially co-presence in a CVE. We hypothesise that collaboration
and interaction enhances co-presence in a CVE. We found that
there was a large difference in co-presence between two CVEs
which produced different levels of collaboration and interaction.
This supports our hypotheses that just having virtual
representations of others is not sufficient to create a high
sense of co-presence, and that one needs collaboration and
interaction in order to enhance co-presence in a CVE. We
measured personal presence subjectively, using a
questionnaire developed by Slater et al. We have developed a
co-presence questionnaire which assesses the levels of
co-presence subjectively. We have also developed a collaboration
questionnaire which measures group collaboration subjectively, as
well as the degree of enjoyment and comfort with others in the
group.
Structuring knowledge acquisition in software development
projects
Trimble, J.A.
Abstract: The goal of this research is to provide
knowledge acquisition approaches that will contribute to reducing
the risk involved in software development. Structured knowledge
acquisition techniques can increase the rate of obtaining and
developing knowledge, and the reliability of the knowledge
acquired. This project builds on knowledge acquisition research
from the expert systems and project management sub-disciplines.
Knowledge representation is examined as a component of the
knowledge acquisition process. This paper is intended to provide
background material and guidelines that will assist in structured
interaction with stakeholders in the software development
process.
Using the Taylor expansion of multilayer feedforward neural
networks
Engelbrecht, A.P.
Abstract: The Taylor series expansion of continuous
functions has shown - in many fields - to be an extremely
powerful tool to study the characteristics of such functions.
This paper illustrates the power of the Taylor series expansion
of multilayer feedforward neural networks. The paper shows how
these expansions can be used to investigate positions of decision
boundaries, to develop active learning strategies and to perform
architecture selection.
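For a single-hidden-layer network the idea can be sketched directly: the analytic input gradient gives the first-order Taylor expansion of the output about a point. Sigmoid hidden units and a linear output are assumed here for simplicity; the paper's own expansions and their uses are not reproduced:

```python
import math

def mlp(x, W1, b1, w2, b2):
    """Single-hidden-layer net: sigmoid hidden units, linear output."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    h = [sig(sum(wi * xi for wi, xi in zip(row, x)) + bi)
         for row, bi in zip(W1, b1)]
    return sum(wi * hi for wi, hi in zip(w2, h)) + b2

def gradient(x, W1, b1, w2, b2):
    """Analytic input gradient: df/dx_j = sum_i w2_i * h_i(1-h_i) * W1_ij."""
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    h = [sig(sum(wi * xi for wi, xi in zip(row, x)) + bi)
         for row, bi in zip(W1, b1)]
    return [sum(w2[i] * h[i] * (1 - h[i]) * W1[i][j] for i in range(len(h)))
            for j in range(len(x))]

def taylor1(x0, x, W1, b1, w2, b2):
    """First-order Taylor expansion of the network output about x0."""
    g = gradient(x0, W1, b1, w2, b2)
    return mlp(x0, W1, b1, w2, b2) + sum(gi * (xi - x0i)
                                         for gi, xi, x0i in zip(g, x, x0))
```

The gradient locates decision boundaries (where the output crosses the class threshold and the gradient is steep) and ranks inputs by influence, which is the basis for the active learning and architecture selection uses mentioned above.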
Website literacy requirements: readability and occulacy
measurement development
Licker, P.; Bailey, D.; Scott, A.; Stamper, B.
Abstract: This paper explores the concept of
literacy requirements for the web and in doing so
examines a number of different ideas about literacy. The concept
of readability is one important component of effective literacy
with regard to websites. The research reported on here applies
several existing and newly derived measures of readability to a
selection of websites. Subsequent analysis is used to draw
conclusions about the validity and applicability of these
measures to determining website readability. A number of literacy
measures were found to be appropriate for determining the
readability of web page content. Two specific measures are the
most significant in relationship to actual reading and navigating
through online content. Further research into web literacy is
proposed.
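As an example of the kind of existing measure involved, the classical Flesch Reading Ease score can be computed as below; the paper's newly derived measures are not reproduced, and the syllable counter here is a crude vowel-run heuristic:

```python
import re

def count_syllables(word):
    """Crude syllable estimate: runs of vowels, minimum one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease:
    206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    Higher scores mean easier text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835 - 1.015 * len(words) / len(sentences)
            - 84.6 * syllables / len(words))
```

Applying such a score to the extracted text of a web page gives one readability component; the "occulacy" measures discussed in the paper would sit alongside it.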
A graphical environment for the facilitation of logic-based
security protocol analysis
Saul, E.; Hutchison, A.C.M.
Abstract: The development of cryptographic logics to
analyze security protocols has provided one technique for
ensuring the correctness of security protocols. However, it is
commonly acknowledged that analysis using a modal logic such as
GNY tends to be inaccessible and obscure for the uninitiated. In
this paper we describe a graphical tree-based specification
environment which can be used to easily construct GNY statements
using contextualized pop-up menus. The interface which we
describe helps to move logic-based analysis out of the world of
academia and into the mainstream market.
Analysing algorithms using computed values
Mueller, C.
Abstract: The purpose of this short paper is to give
some insights as to how computed values can aid algorithm design.
A simple algorithm for finding the average is expressed in terms of
computed values, illustrating how one can express an algorithm in
a paradigm-independent form. This same problem is used to show
how a refinement process enables an algorithm to be developed
from a specification. The refinement process enables alternatives
to be explored at each step of the design. Looking at the
number of computed values and their dependencies provides some
simple tools for analysing the choices; this enables decisions to
be made as to which alternative to select. Some simple theory is
developed for describing the computed values. The advantage of
the theory is that it is based on commonly used mathematical
notation and a restricted form of predicate logic. Hence anyone with
some basic mathematical background is able to understand the
semantics of the theory.
An object oriented approach to parser generation in C++
Cosgrave, L.; Power, J.; Waldron, J.
Abstract: In this paper we describe the design and
implementation of a system for representing context-free grammars
in C++. The system allows for grammar representation at the
object level, providing enhanced modularity and flexibility when
compared to traditional generator-based approaches. We also
describe the transformation of grammar flow analysis problems
into an object-oriented framework using the Visitor pattern, as
well as the implementation of a top-down LL(1) parser. As such,
this work represents the synthesis of three presently disparate
fields in parser design and implementation: combinator parsing,
fixpoint-based grammar flow analysis, and object-oriented design.
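One such grammar flow analysis, the fixpoint computation of FIRST sets, looks like this in outline; it is sketched in Python rather than the paper's C++, with the empty string standing for ε:

```python
def first_sets(grammar, terminals):
    """Fixpoint grammar flow analysis: FIRST(A) is the set of terminals that
    can begin a string derived from A.  grammar maps nonterminals to lists of
    productions (tuples of symbols); the empty tuple is an epsilon production."""
    first = {nt: set() for nt in grammar}
    changed = True
    while changed:
        changed = False
        for nt, prods in grammar.items():
            for prod in prods:
                before = len(first[nt])
                for sym in prod:
                    if sym in terminals:
                        first[nt].add(sym)
                        break
                    first[nt] |= first[sym] - {""}
                    if "" not in first[sym]:
                        break
                else:
                    first[nt].add("")      # every symbol was nullable
                changed |= len(first[nt]) != before
    return first
```

FIRST sets (with FOLLOW sets, computed the same way) are exactly what an LL(1) parser needs to choose a production from one token of lookahead.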
Conflict analysis as a means of enforcing static separation
of duty requirements in workflow environments
Perelson, S.; Botha, R.A.
Abstract: The increasing reliance on information
technology to support business processes has emphasised the need
for information security mechanisms. This, however, has resulted
in an ever-increasing workload in terms of security
administration. Policy-based approaches have been proposed,
promising to lighten the workload of security administrators.
Separation of duty is one of the principles cited as a
requirement when setting up these policy-based mechanisms.
Different types of separation of duty policies exist. They can be
categorised into policies that can be enforced at administration
time, viz. static separation of duty requirements and policies
that can be enforced only at execution time, viz. dynamic
separation of duty requirements. This paper deals with specifying
static separation of duty requirements in role-based workflow
environments. It proposes a mathematical model based on the
concept of 'conflicting entities' to express static
separation of duty requirements. It provides a detailed
explanation of the integrity checking that must take place at
administration time to ensure that specified separation of duty
requirements are honoured.
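In outline, the administration-time integrity check reduces to scanning role assignments against the declared conflicts; the role names below are invented for the example, and the paper's model is richer than this sketch:

```python
def violates_sod(assignments, conflicts):
    """Administration-time check: return the (user, role_a, role_b) triples
    where a user holds two roles declared as conflicting entities."""
    violations = []
    for user, roles in assignments.items():
        for a, b in conflicts:
            if a in roles and b in roles:
                violations.append((user, a, b))
    return violations
```

An administrator would run such a check whenever a role assignment changes, rejecting the change if any violation appears.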
Defining and specifying graphs using formal model-based
techniques
Kotze, P.
Abstract: Graph theory is an established field of
study. The concepts of graphs and transition networks are
well-known in computing. Mathematical expressions of some kind
are almost always used to define graphs. Although these
definitions are generally considered to be exact, one runs into
difficulty when attempting to specify various graph definitions
using a model-based specification notation such as Z, or when
implementation structures are considered. In order to
successfully do so one has to change the general mathematical
definitions of graphs. This paper provides a set of such
alternative definitions based on the use of bag structures.
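A flavour of why bags help: a multiset of edges can represent parallel edges, which a plain set-of-pairs definition collapses. A minimal Python analogue follows (the paper's Z definitions are not reproduced):

```python
from collections import Counter

def make_multigraph(edges):
    """A graph as (vertex set, edge bag): the bag (multiset) of edges admits
    parallel edges between the same pair of vertices."""
    vertices = {v for e in edges for v in e}
    return vertices, Counter(edges)

def degree(graph, v):
    """Degree counts each parallel edge once per copy, and a self-loop twice."""
    _, bag = graph
    return sum(n * ((a == v) + (b == v)) for (a, b), n in bag.items())
```

The same bag-based shape translates naturally into Z's bag toolkit, which is the route the paper's alternative definitions take.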
Digital imaging challenges for artificial human vision
Boyle, J.Maeder, A.Boles, W.W.
Abstract: Current research to develop artificial human
vision relies on the inducing of digital image-like effects in
the brain. This paper reviews related projects underway
internationally, and then discusses aspects of image processing
which could provide enhanced visual information to users of such
systems, where image quality is severely restricted. Some more
advanced options involving computed pseudo-scenes providing
virtual understanding of natural visual scenes are proposed.
Fast tracking students from disadvantaged backgrounds into
mainstream Computer Science
Blignaut, R.J.; Venter, I.M.; Cranfield, D.J.
Abstract: A computer-based training (CBT) system was
used to teach Computer Literacy to full-time students at the
University of the Western Cape. This approach was successful in
creating computer literate students as well as creating an
opportunity for students from educationally disadvantaged schools
to enter the Computer Science course. The students experienced
this new approach to learning positively. This has laid the
foundation to export the computer-based education model to
communities outside the university. Lifelong learning
opportunities will thus be created.
Handling diversity in group work in the information systems
class
Thomas, T.; De Villiers, C.
Abstract: Tertiary institutions in South Africa are
faced with dealing with diversity in all its forms in our
classrooms. Information Systems, Information Technology and
Computer Science students need to learn to work with people who
are different from themselves in order to learn to work
effectively in the work environment to which they will go.
Teaching students in a multicultural classroom to be able to
practise their profession in multicultural settings is crucial.
This paper looks at the problems that occur if we ignore
diversity and at some techniques for dealing with diversity,
especially when using group work, and then presents the results of
a series of four case studies where some of these techniques were applied.
Holistic programming environments
Marsden, G.; Thimbleby, H.
Abstract: As a result of the popularity of graphical user
interfaces, it is now almost impossible to buy a programming
language compiler; instead, one purchases a development
environment. Of course, we can scoff at the distinction and say
that a development environment is nothing more than a programming
language with visual (as opposed to syntactic) sugar. We believe,
however, that this view must change if safer and more responsible
programming languages are to be created for the next generation
of programmer. Within this paper, we will argue that a more
theoretical approach should be taken to the development of
programming environments and suggest ways in which this may be
achieved.
Organising educational resources on an Intranet
Min, Y.S.; Thomas, T.
Abstract: Technology influences education in a number
of ways. One of these influences is in the classroom, creating
what can be called a computer supported learning environment. A
computer supported learning environment has many tools available
for enhancing the learning experience. Some of these tools
are educational resources tempered with technology. These
resources include interactive tutorials, animations and
simulations. It is important for an educator to store these
resources in such a way as to make them accessible to both
students and fellow educators. In storing the resources, there
are several factors that have to be remembered. These factors
include the reusability of materials, the linking of materials to
courses and the navigation through the course sites.
Presence as a means for understanding user behaviour in
virtual environments
Blake, E.; Casanueva, J.; Nunez, D.
Abstract: Presence has become a key concept in
characterizing and evaluating Virtual Environments. Our
contribution is to show that current measures of Presence, as a
metric of users' experience of Virtual Environments, are highly
problematic: results from the literature cannot be repeated and
they lack a theoretical basis. We synthesize results from three
experiments we conducted and in conclusion point the way to
alternative approaches to the problem of characterizing
Collaborative Virtual Environments.
Virtual multicasting as an example of information mass
transit
Machanick, P.; Andrew, B.
Abstract: Virtual Multicasting (VMC) is a specific
instance of a more general idea, Information Mass Transit (IMT).
IMT aims to reduce the waste of bandwidth resulting from
individual streams of data, while improving user-level latency.
By analogy with mass transit where shared transport reduces the
load on infrastructure, IMT aims to use networks and other
infrastructure more efficiently. VMC combines some of the
benefits of caching (transparency, dynamic adaptation to
workload) and multicast (reducing duplicated traffic).