Dirk Riehle, Erica Dubach
UBS AG, Ubilab
P.O. Box, CH-8098 Zurich
E-mail: {riehle, dubach}@acm.org
Abstract
A modern bank needs several attributes to survive successfully in the marketplace. Two key attributes are that it reacts flexibly to change and is capable of innovation. Many of a modern bank's processes and organizational structures are reflected in software. Therefore, a bank's IT is a key player in guaranteeing these attributes. In this position paper, I discuss how maintaining these attributes influences the definition and introduction of new banking products, and how I believe IT can improve its support through the use of dynamic object models.
Current situation
From an IT (information technology) point of view, the typical
process of developing and introducing a new product by a bank
division looks like this: a gifted banker or a committee has
an idea, plans it, and eventually assigns a development task to
one of its software development/maintenance units. The IT unit
then either designs and implements new software or adapts existing
systems so that they support the new or refined product.
The development of a new product varies significantly between
divisions and between product types. It might take from a few weeks
to several months until a new application release is available.
Different divisions use different tools, for example 4GL systems,
CASE tools and procedural languages, or object orientation and
frameworks.
Common to all these approaches is that
- the developers need to understand the banking domain and
their task well in order to build adequate software; and that
- the developers devote much of their time to coping with the
technical complexity of their development approach and its tools.
This approach leads to a noticeable delay between the definition
of an idea and the ability of the bank to introduce the product
into the marketplace. The consequences are that
- time-to-market may be too long: the window of opportunity
is missed; products, in particular short-lived ones, do not pay
back the investment made in their development; etc.
- the software is inadequate: what was defined on
paper may not be what the bankers get, in particular if the
product is complex.
- innovation is hindered: if it takes a few weeks or even months
until a product can be handled using a computer, playful exploration
and what-if scenarios become impossible.
Dynamic object models seek to overcome these problems by drastically
reducing time-to-market, by giving immediate feedback on what
a new application looks like and how it works, and by allowing
bankers to experiment with new product types.
What dynamic object models can do for us
It is not yet well defined what a dynamic object model is.
My attempt at a first definition is the following:
An object model is an abstract representation of a particular
domain, using objects as the description mechanism. A dynamic
object model is an object model whose object representation
is interpreted at runtime and can be changed with immediate (but
controlled) effect on the system interpreting it.
Obviously, a dynamic object model cannot be thought of without
a system interpreting the model. The dynamic object model is embedded
in a system and affects its execution. Thus, dynamic object models
require adequate software support. The software must provide the
following functionality (a minimal sketch follows the list):
- a metamodel that provides the model elements from which object
models can be built;
- tools that let users define object models;
- a model engine that interprets or just-in-time-compiles-and-runs
object models;
- a well-defined connection between an object model definition
and its execution by a model engine.
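
To make this more concrete, the following is a minimal sketch in Python; the names (AttributeSpec, ProductType, ModelEngine) are hypothetical and only illustrate how a small metamodel and a model engine that interprets it at runtime might divide the work.

class AttributeSpec:
    """Metamodel element: a named, typed attribute of a product type."""
    def __init__(self, name, value_type):
        self.name = name
        self.value_type = value_type


class ProductType:
    """Metamodel element: a banking product type defined by its attributes."""
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = {spec.name: spec for spec in attributes}


class ModelEngine:
    """Interprets object models at runtime instead of compiling them in."""
    def __init__(self):
        self.product_types = {}

    def register(self, product_type):
        # A new or changed type takes effect for all subsequent calls.
        self.product_types[product_type.name] = product_type

    def create_product(self, type_name, **values):
        # Interpret the product type: check the supplied values against it.
        ptype = self.product_types[type_name]
        for name, spec in ptype.attributes.items():
            if not isinstance(values.get(name), spec.value_type):
                raise ValueError(f"{type_name}.{name} must be a {spec.value_type.__name__}")
        return {"type": type_name, **values}
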
Let us examine the consequences of using dynamic object models.
The primary property of dynamic object models and their
supporting software is that changes to an object model can
take effect immediately. The model engine may immediately
execute the new model, which means, for example, that a new product
type is in place with minimal delay.
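
Staying with the hypothetical sketch above, a model change is just another call against the running engine rather than a new software release:

engine = ModelEngine()

# Define an initial product type and use it right away.
engine.register(ProductType("SavingsAccount", [
    AttributeSpec("interest_rate", float),
]))
account = engine.create_product("SavingsAccount", interest_rate=0.02)

# Later, a banker refines the type; the very next call already
# interprets the changed model, with no recompilation or redeployment.
engine.register(ProductType("SavingsAccount", [
    AttributeSpec("interest_rate", float),
    AttributeSpec("minimum_balance", int),
]))
account = engine.create_product("SavingsAccount",
                                interest_rate=0.02, minimum_balance=1000)
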
The positive consequences of this minimal turnaround time
are:
- time-to-market is as short as possible;
- costs of developing software support for new products are
drastically reduced;
- misunderstandings between bankers and IT are reduced, as immediate
feedback is provided;
- innovation is fostered: given a safe setting, changes to
a model can be explored and played with.
The negative consequence of this approach is:
- generic tools may be sub-optimal, as they may not provide
the best possible handling of a product.
Product-specific tools typically do a better job in supporting
the handling of a new product than generic tools do. But then,
if product-specific tools are needed (which may not be the case
for short-lived products), they can always be implemented later,
after one has learned from using the generic tools what the requirements
for these product-specific tools are.
State of the art
As the UIUC workshop on "metadata and dynamic object models"
showed, several systems of this type already exist and are in
use in industries like insurance and telecommunications billing.
In addition, personal communication with other researchers
suggests that similar efforts are going on in the R&D departments
of large enterprise software solution providers like SAP and BAAN.
All systems we have seen use a metamodel based on object concepts.
Existing approaches like knowledge-based systems, constraint systems,
or procedurally implemented systems tend to be too far away from
providing the proper modeling elements for modeling "the
real world."
Object orientation provides a significant advantage, because
it has always been aimed at modeling domains, for example, business
domains. Other technologies can be used in support of object
orientation. For example, constraint systems aid in implementing
business rules. Thus, an object-oriented metamodel may serve as
a core, as the common integration platform, but probably needs
to be extended with further modeling concepts.
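
Continuing the hypothetical sketch from above, one way to picture such an extension is to attach business rules to a product type as constraint objects that the engine evaluates in addition to its structural checks:

class Constraint:
    """Metamodel extension: a named business rule over a product's values."""
    def __init__(self, description, predicate):
        self.description = description
        self.predicate = predicate  # callable taking the product's value dict

def check_rules(product, constraints):
    """Evaluate all business rules; return the descriptions of violated ones."""
    return [c.description for c in constraints if not c.predicate(product)]

# Hypothetical rules for the savings account type used earlier.
savings_rules = [
    Constraint("interest rate must not be negative",
               lambda p: p["interest_rate"] >= 0.0),
    Constraint("minimum balance must be at least 100",
               lambda p: p["minimum_balance"] >= 100),
]

product = {"type": "SavingsAccount", "interest_rate": 0.02, "minimum_balance": 50}
violations = check_rules(product, savings_rules)  # second rule is violated
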
Systems with a dynamic object model are inherently reflective
systems. As reflective systems become more common and the concepts
involved are more widely understood, we may eventually
help free reflective systems from their obscurity and give them
a (non-technical) business justification.
Model-driven systems have been built before, probably precisely
for the reasons given above. The difference from these approaches
is that today we have an improved understanding of the modeling
issues involved. In particular, we can use object orientation
as a common basis, and we now see the business justification more clearly.
Conclusions
I conclude that it is critical for a bank to build up know-how
in this area. Whether this know-how is built up by buying and
analyzing systems or by developing its own systems is merely
a tactical issue. What is important is that these kinds of systems
will play a significant role in the future. They are a key to
reacting flexibly to change and remaining innovative.
Many issues surrounding dynamic object models are unclear at
this point in time. As far as I know, they have not yet been addressed
by openly accessible research. Questions that I suggest the workshop
address include, but are not limited to:
- Expressiveness of a metamodel
  - How big should a metamodel be?
  - How do a metamodel and the domains being modeled relate?
- Execution of dynamic object models
  - How are the models evolved?
  - How are conflicts between different model versions handled?
- Use of available technologies
  - Which available technologies can be used?
  - How do we best integrate existing technology into a system based
on dynamic object models?
- Development approach
  - How do we need to adapt the development approach?
  - How do we hook up product-specific tools?
I hope that these questions will contribute to a workshop at
which we will have lots of fun and think deeply about these kinds
of systems and what they can do for us.