1998 - Year 2000 (Y2K) problem coverage
continues in this issue (the previous issue is at
http://www.unescap.org/stat/gc/gcnl/gcnl10.asp). In April 1998, high-level delegations at
the ESCAP Commission discussed the Y2K problem
and urged members and associate members to react
without delay. In June 1998, the General Assembly
adopted a resolution on the Y2K problem and
will discuss it again at its 53rd
session starting in September. At a more technical
level, ESCAP and the Statistical Institute for
Asia and the Pacific (SIAP) organized a workshop
on the Y2K problem for national statistical
offices (NSOs) from 18 to 19 June 1998.
The level of Y2K awareness seems to have improved
significantly in many statistical offices, but
overall preparedness is still very low.
While not downplaying the importance of continued
awareness creation, the SIAP/ESCAP Workshop
emphasized the need for practical advice that
could help individual organizations kick-start
their Y2K work. It therefore chose to address its
recommendations directly to NSO managers, the
group that can make Y2K compliance a reality
in those offices. The Workshop's good mix of
statisticians, IT specialists, and managers cast
the recommendations in a format applicable to
all government agencies and private enterprises.
As a follow-up to the Workshop, the secretariat
has sent the recommendations and resource papers
to national statistical offices in the region,
with a request that they be shared with other
government departments. They are also available
at the ESCAP Web site http://www.unescap.org/stat/gc/escapy2k.asp.
Although patching of old Y2K codes is rightly
a dominating IT topic, many new developments
are taking place, and many of them are very
significant for developing countries. In this
issue, we take a look at some significant advancements
in wireless telecommunications and the impending
breakthrough of electronic commerce. In
supply-driven IT markets, governments should
monitor such key trends. By making the right policy
choices early, they can accelerate national
IT development, improve business prospects in
the private sector, and enhance efficiency within
the civil service.
Y2K problem awareness increasing but action
still lags behind
Government agencies and private enterprises
are very likely to experience an unpleasant
surprise if they continue to ignore
the year 2000 problem in computers. A Workshop,
organized from 18 to 19 June in Bangkok by the
United Nations ESCAP and the Statistical Institute
for Asia and the Pacific, demonstrated that
for national statistical offices (NSOs), at
least, the level of preparedness is nowhere
near where it should be by now. A common
misconception in developing countries is that
they are not affected because of their low level of computerization.
Building on decisions taken earlier at ESCAP's
intergovernmental forums, the Workshop adopted
a set of action-oriented recommendations designed
to help national statistical offices kick-start
their Y2K work. As there are many commonalities
between statistical offices and other government
agencies, the Workshop felt that the recommendations
were relevant for all organizations in developing countries.
The Workshop emphasized that chief executives
are responsible for Y2K problem resolution.
That responsibility can neither be outsourced
nor delegated to IT departments.
In many of the national statistical offices
represented at the Workshop, no concrete work
has been done to identify the existence or impact
of the Y2K problem. That is alarming in view
of the fact that many of them are running custom-made
COBOL applications on
mainframe platforms, a combination that is certain
to be affected by the Y2K problem. Those systems
are used to process and store results of statistical
surveys, population censuses, and economic and
social indicators. Inaction could effectively
paralyse such statistical offices. The new decade
is challenging enough for them even without
Y2K problems, as most are preparing
for the year 2000 round of censuses and other
significant statistical undertakings.
Today, delays in providing key statistical
indicators -- such as might result from Y2K
problems -- are very likely to cause uncertainty
and abrupt reactions in financial markets, for
instance. Think only about how closely price
movements or changes in employment figures or
trade flows are monitored by the media, and
how rapidly the markets react to those
data. Or consider national accounts, which are
built up from a very large number of other statistics,
and then used to adjust national policies.
Of the 23 national statistical offices represented
at the Workshop, the situation was considered
satisfactory only in Australia; Hong Kong,
China; New Zealand; and Singapore, but even they
could not be certain that everything would be
finished in time. The two largest countries,
China and India, admitted that they faced a
colossal task to make their large and decentralized
statistical systems Y2K compliant. Although
they are fortunate to have large pools of computer
professionals, much work remains to be done.
The road to compliance is also long in a number
of developing countries that started computerizing
statistical operations relatively early, in
a mainframe environment. That group includes
Brunei Darussalam, Fiji, Indonesia, the Islamic
Republic of Iran, Malaysia, Myanmar, the Philippines,
the Republic of Korea, Sri Lanka, and Thailand. Another
group of countries (Lao People's Democratic
Republic, Maldives, Mongolia, Nepal, and Viet
Nam) have based their statistical systems on
PC-technology, with Windows NT or Unix servers
supporting LANs. Their main problems relate
to inadequate budgets for replacing non-compliant
PCs, LAN equipment, and software.
Recession-hit countries are having particular
difficulties in finding resources for Y2K projects.
For instance, the National Statistical Office
of Thailand is well aware of the problem and
is in the process of making its applications
compliant within its existing budget. However,
its already squeezed budget does not permit
the replacement of the operating systems of
the two mainframe computers, forcing the NSO
to experiment with ad hoc solutions, such as
turning the computer clocks back to an earlier year.
The Workshop cautioned that replacement of
applications is in most cases no longer a viable
alternative because of the time required for
designing, programming, implementation, testing,
and above all, for migrating the data and the
whole statistical process to the new system.
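One widely used low-cost remediation technique, not part of the Workshop's recommendations but sketched here for illustration, is "windowing": interpreting two-digit years relative to an assumed pivot year so that stored data files need not be converted. The pivot value below is an arbitrary assumption for the sketch.

```python
PIVOT = 30  # assumption: two-digit years 00-29 mean 2000-2029, 30-99 mean 1930-1999

def expand_year(yy: int) -> int:
    """Expand a two-digit year into a four-digit year using a fixed window."""
    if not 0 <= yy <= 99:
        raise ValueError("expected a two-digit year")
    return (2000 if yy < PIVOT else 1900) + yy

print(expand_year(5))   # 2005
print(expand_year(85))  # 1985
```

The pivot must be chosen to suit the range of dates actually stored; windowing only defers, rather than eliminates, the underlying two-digit representation problem.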
The Workshop recognized that it was crucial
to raise the Y2K awareness level within governments.
However, the awareness created by media campaigns
and intergovernmental forums is being converted
into action by individual organizations only
very slowly. The achievement of Y2K compliance
is possible only by combining bottom-up and
top-down approaches. On the one hand, it is
possible to do a lot of preparatory work without
separate funding in terms of awareness creation,
making an inventory of possibly affected items,
and in seeking compliance information from vendors.
On the other hand, finance ministries, funding
authorities, and CEOs in public sector organizations
have to understand that the magnitude of the
task is such that it cannot be managed without
additional human and financial resources.
The following is an abbreviated set of recommendations
made by the Workshop, addressed to NSO managers.
Y2K action list
- Stop waiting for somebody to come to study and fix Y2K problems. That is not going to happen. Accept ownership of systems and full responsibility for them. Lack of technological expertise is not an acceptable excuse for inaction.
- Appoint immediately a full-time Y2K coordinator, with managerial skills and the necessary authority to initiate actions and delegate responsibilities.
- Back up data and systems securely before doing any Y2K compliance testing. Ensure that the backup data can be read. Document backups properly.
- Use all talents within the organization to create multidisciplinary teams to undertake step-wise rectification. Include a mix of management staff, IT staff, and substantive experts. Locate them together and relieve them from other responsibilities to the extent possible.
- Ensure that anything new developed or purchased is Y2K compliant.
- Use industry-standard steps in achieving Y2K compliance, but simplify where possible. Information is available on the Internet, in the literature, and in IT magazines.
- Include embedded chips in the Y2K inventory and seek out compliance information for them.
- Start conversion of mission-critical systems in priority order.
- Remember that, although necessary, awareness campaigns or lengthy planning processes will not resolve Y2K problems. Start practical work immediately.
- Do not wait for funding before starting. A lot of preparatory work can be done without separate funding, especially in awareness creation, in preparation of an inventory of affected items, and in seeking compliance information from vendors.
- Do not rely on general compliance statements. Be specific when requesting compliance information from equipment and software suppliers. Ask when compliant replacements will become available, and how they will be installed and operationalized.
- Demonstrate the impact of non-compliant components and systems to management by writing down what would happen if each affected system were not available.
- Transmit those technology and business assessments to top management and to the authority providing the budget.
- Enlist the support of important clients to strengthen Y2K funding.
- Concentrate on resolving only the Y2K problem itself. Do not attempt to simultaneously improve the functionality or other features of existing systems.
- Slow down or postpone new IT development to conserve resources.
- Remember that Y2K projects carry the pitfalls of typical IT projects, including overly optimistic scheduling, poor documentation and incomplete debugging. Projects have a tendency to drag on longer than initially planned.
- Make a contingency plan at an early stage.
- Include a worst-case scenario in the contingency plan.
- Document all Y2K efforts from the beginning. That written evidence may be invaluable later.
- Do not let top executives delegate or outsource their responsibility (it is not possible). Y2K compliance is a major business issue; the alternative is often closure of operations.
- Disseminate these recommendations to all staff in the organization.
- The tenth session of the
Working Group of Statistical Experts in November
1997 was the first ESCAP meeting to consider
the implications of the year 2000 problem
in the region. It made recommendations to
the governments and national statistical offices.
- In April 1998 the ESCAP
Commission, the body's highest decision-making
authority, urged governments to tackle the
year 2000 problem as a priority issue.
- The Workshop on the Year
2000 Problem in Computers and Strategic Issues
for National Statistical Offices was organized
together with the Tokyo-based Statistical
Institute for Asia and the Pacific from 18
to 19 June 1998. It reviewed the status of
Y2K work in national statistical offices and
made the recommendations quoted in this Newsletter.
- The ESCAP secretariat,
while not having operational resources to
assist members and associate members in a
more tangible way, such as by providing advisory
services or responding to individual technological
questions, has been creating awareness through
the Government Computerization Newsletter;
a Y2K special issue was published in December
1997 (http://www.unescap.org/stat/gc/gcnl/gcnl10.asp), and the coverage continues with this issue.
- The Committee on Statistics
will review Y2K progress in national statistical
offices again at its 11th session,
from 24 to 26 November 1998.
Readers may ask
why ESCAP's Y2K coverage is so "statistical".
The main reason is the need to use scarce,
already committed resources efficiently. The
latest Workshop, for instance, was possible only
by pooling the resources of SIAP and ESCAP.
Another reason is that the Committee on Statistics
oversees the secretariat's public sector computerization
activities, including those related to the Y2K problem.
World Bank Y2K grants for developing countries
The Information for Development (infoDev) Program
of the World Bank is accepting applications
from national governments for grants for assistance
with the Year 2000 Problem. The grants have
been made possible by a contribution to
infoDev from the United Kingdom Department for
International Development.
Grants will be of two types:
Planning Grants to support
the development or improvement of National Action
Plans to deal with the Year 2000 Problem. Maximum
size for a planning grant will be US$100,000.
infoDev anticipates that there will be at least
40 such grants. Given the urgency of the Year
2000 problem, work should be completed within
six months of the award. In order to qualify
for such a grant, a national government must
assign a Year 2000 focal point responsible for
implementing the Planning Grant. Counterpart
resources are encouraged.
Implementation Grants to support
remediation, testing and evaluation of targeted
systems, ideally conducted under a National
Year 2000 Action Plan. Maximum size for an implementation
grant will be US$500,000. infoDev expects that
there will be at least 20 such grants. Work
should be completed by the Year 2000. The National
Government Proposal should designate an Implementing
Agency for an Implementation Grant.
In order to qualify for such a grant, the National
Government must demonstrate that complementary
resources are available for implementation of
a Year 2000 Program from sources including the
national government, non-governmental organizations,
academic institutions, the private sector, multilateral
institutions, and/or other donors. Such counterpart
resources must be available in the following ratios:
- two dollars to every infoDev
dollar for countries eligible for International
Development Association funding (a listing
of such countries is available from the World
Bank web site: http://www.worldbank.org/aspl/extdr/eligible.asp);
- five dollars to every
infoDev dollar for lower-middle income countries
not eligible for International Development
Association funding (i.e. non-IDA eligible
countries with per capita GNP under US$3,115 in 1996);
- ten dollars to every infoDev
dollar for upper-middle income countries (i.e.
countries with per capita GNP from US$3,115
to US$9,636 in 1996).
Source: infoDev Web site, http://www.worldbank.org/infodev/y2kguide.asp, which contains application forms and more
details. You may also contact firstname.lastname@example.org,
tel. (1-202) 458-5153, fax (1-202) 522-3186,
for more information.
All are exposed to the Y2K problem - some more than others
Exposure to the year 2000 problem depends on
the existence of software, hardware and embedded
systems that use two-digit dates. The following
gives indications about the kinds of organizations
that have low and high exposure to the year 2000 problem.
Factors that reduce (but do not eliminate) Y2K problem exposure:
- Low rate of computerization
- IT development started in full force less than five years ago
- Low rate of electronic technology in buildings
- Use of latest operating systems
- Use of latest off-the-shelf software
- New PCs and LAN equipment
- Low dependence on telecommunications
- Low rate of electronic transactions with collaborators and clients
- Systems are of recent development, and developed with latest tools
Indicators of high Y2K problem exposure:
- IT-intensive working environment
- Long IT history; application development started in the 1980s or earlier
- Modern and convenient buildings
- Use of mainframe computers and client-server technology in general (studies show that most new mainframes run old applications)
- Old operating systems; use of several operating systems
- Large stock of old PCs
- Business based on telecommunications
- High rate of electronic exchange of data and information
- Applications are old and poorly documented
- Staff responsible for development of systems are no longer with the organization
- Haphazard and poorly documented data backup strategies and practices
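The underlying defect can be shown with a minimal sketch (the field names are hypothetical): arithmetic on two-digit years silently breaks once the century turns.

```python
def years_of_service(hired_yy: int, current_yy: int) -> int:
    # Naive arithmetic on two-digit year fields, as in much legacy code.
    return current_yy - hired_yy

print(years_of_service(85, 99))  # 14: correct while both dates fall in the 1900s
print(years_of_service(85, 0))   # -85: the year 2000, stored as "00", is read as 1900
```

Any comparison, sort, or interval calculation on such fields inherits the same failure mode.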
Belonging to the low-exposure group does not mean
that an organization could not encounter year 2000
problems; the opportunities for the problem
to occur are many. To be able to
say with high probability that a component
is not affected by dates, all of the following
criteria should be met (taken from http://www.iee.org.uk/2000risk/updates/update02.asp):
a) There is no real-time clock.
b) Timers do not use a difference in dates to calculate a time.
c) There is no battery backing for the processor or memory. This point should not be taken as sufficient evidence in itself, as some chips maintain information (and potentially therefore date information) even though there is no visible means of power.
d) There is no access to internal or external non-volatile devices, such as disks and tapes.
e) The date is not available (known) at the operating system layer.
f) The application language contains no constructs or libraries that use or manipulate dates.
g) There is no external date interface from clocks or over communications links.
h) The operator never sets a date in the system.
i) The operator never sees any dates (on input or output).
If the last three conditions (g, h and i, the
detection of which does not require deep technological
knowledge) are true and if the system concerned
has also low potential impact on the operations
of the organization, the risk from that system
may be deemed low, and further investigations
could be given a low priority. However, as the
year 2000 problem is often invisible (especially
in embedded systems), systems whose failures
have potentially greater impact would need to
be assigned a high testing priority.
Year 2000 conformity means that neither the
performance nor the functionality is affected
by dates prior to, during and after the year
2000. Full conformity means that:
- No value for (current)
date will cause any interruption in operation.
- Date-based functionality
must behave consistently for dates prior to,
during and after the year 2000.
- In all interfaces and
data storage, the century in any date must
be specified either explicitly or by unambiguous
algorithms or inferencing rules.
- Year 2000 must be recognised
as a leap year (29/2/2000 and 366 days in
the year 2000).
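The leap-year requirement deserves emphasis, because the year 2000 is a leap year only by virtue of the often-forgotten 400-year rule. A minimal check:

```python
def is_leap(year: int) -> bool:
    # Gregorian calendar rule: every 4th year is a leap year,
    # except century years, except centuries divisible by 400.
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

print(is_leap(2000))  # True: 2000 is divisible by 400
print(is_leap(1900))  # False: 1900 is a century year not divisible by 400
```

Software that implements only the first two clauses of the rule wrongly treats 2000 as a common year of 365 days.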
Source: British Standards Institution Committee
BDD/1/-/3, DISC PD2000-1: A definition of year
2000 conformity requirements (http://www.iee.org.uk/2000risk/guide/year2k98.asp). The site of the Institution of Electrical
Engineers is an excellent source for information
about the Y2K problem, especially in embedded
systems, see http://www.iee.org.uk/2000risk/.
High-level Y2K concerns
The ESCAP Commission expressed, at its fifty-fourth
session in April 1998, a deep concern about
the predicted disruptions that the year 2000
problem in computers and embedded chips was
likely to cause at the national, regional and
global levels. Noting the slow start made by
many countries of the region in tackling the
problem, it urged all governments to make its
resolution a high priority. The Commission recognized
that the problem affected infrastructure services
such as electricity supply and telecommunications,
as well as banking and other systems. The Commission
emphasized that it was the responsibility of
top level management to initiate organization-wide
action to address the issue. For identification
and resolution of the problem, the Commission
recommended the use of multidisciplinary teams
that periodically reported on progress to high-level management.
As an immediate measure, the Commission recommended
that organizations demand guarantees from suppliers
that all new software and equipment were year
2000 compliant. The Commission advised all organizations
to make contingency plans in case of failure
of their own systems or of external or foreign
systems that they were increasingly dependent
on. Given the urgency of the situation, the
impending high work volume in fixing existing
systems meant that mission-critical applications
had to be given the highest priority. The Commission
warned that any delays were likely to increase
the modification cost and make the timely resolution
of the problem very difficult, as the required
skills were already in short supply.
The General Assembly adopted on 26 June 1998
a resolution on macroeconomic policy questions,
addressing the global implications of the year
2000 conversion problem for computers. By the
text, the Secretary-General was asked to develop
a plan to make the United Nations system "year
2000 compliant", and to monitor related funding
sources for developing countries and those with
economies in transition. A topic "Global implications
of the year 2000 date conversion problem of
computers" will be included in the agenda for
the Assembly's fifty-third session, which begins
in mid-September 1998.
Internet connectivity update
Since our last report in June 1997,
- American Samoa (AS) is
up with a full Internet connection
- Cook Islands (CK) has
started with a provisional full Internet connection
- Kiribati (KI) has announced
it will start Internet service in July 1998
- Myanmar (MM) has joined
with a UUCP link
- Viet Nam (VN) has been
upgraded from UUCP to full Internet
During the past year, the number of ESCAP's
regional members and associate members without
full Internet connection shrank by four, to
11 out of 56. The next table shows the overall
connectivity situation in the region. It must
be emphasized that the list tells very little
about the connectivity of government departments
or civil servants, who still very seldom have
even a basic e-mail connection at their disposal.
Internet connectivity of ESCAP members and associate members
[Table: for each of the 56 regional members and associate members, the ISO 3166 code, the country or area name, and its Internet connectivity status. Visible entries include the Democratic People's Republic of Korea; Hong Kong, China; the Islamic Republic of Iran; the Lao People's Democratic Republic; the Northern Mariana Islands; Papua New Guinea; and the Republic of Korea.]
Notes: a connection may be a UUCP or FIDOnet network; P = provisional connection.
Data sources: International E-mail
accessibility FAQ; © Olivier M.J.
Crepin-Leblond, release 97.02.07 of 3
July 1997, available at http://www.ee.ic.ac.uk/misc/country-codes.html
and Global Web Explorer -- Linking with
every country throughout the World; ©
Robert S. Duggan & Steven H. Gibbs;
http://www.guernsey.net/~sgibbs/; supplemented by ESCAP secretariat information.
Public sector computerization Web site
At the end of March 1998, the secretariat launched
dedicated Public sector computerization pages
on its Web site. As our readers know, public
sector computerization is a small-scale secretariat
activity promoting the use of modern information
technology in member and associate member governments.
From the home page http://www.unescap.org/stat/gc/pschome.asp, Internet users can conveniently link to the
Government Computerization Newsletter and
to a collection of reference links (http://www.unescap.org/stat/gc/psclinks.asp).
The links are related to information technology
in the public sector and to information technology
management in general. National links to government
IT sites are provided to encourage intercountry
contacts and exchange of information among the
target audience (which is the same as the target
readership of the Newsletter): persons creating
national and organization-wide IT policies and
strategies in ESCAP member and associate member
governments, i.e. in the Asia and Pacific region.
They are typically
- senior civil servants
- analysts studying the use of information technology in the public sector.
Please visit the Public sector computerization
site and draw it to the attention of colleagues
and associates. We would be grateful for hints
of suitable links to add to the collection.
Also, please do not hesitate to point out outdated
links. That would be useful as the pages are
maintained, at least for the time being, with
minimal resources and are updated on an ad hoc
basis (as and when time allows).
Trend watch: Two technologies maturing remarkably
Information technology evolution continues
with remarkable and accelerating speed. However,
the reality for many developing countries is
not as bright as a quick look at the presented
Internet connectivity table might suggest. Few
developing countries are able to utilize IT
effectively yet; in fact, many keep falling further
behind the industrialized countries. But let
us choose a positive angle and examine two areas
of information and communication technology
that have advanced significantly during the
past year, namely wireless telecommunication
and electronic commerce.
Developing countries could benefit greatly
from these two technologies if they choose to
do so (and are given the right advice and support).
Their main hurdle in applying these and other
new technologies is of course their low purchasing power.
Multinational vendors are currently still doing
better business by concentrating their efforts
in markets that provide higher profit margins.
Nevertheless, as the technologies and the primary
markets mature, developing countries' markets
will start looking more lucrative.
Wireless boom set to accelerate
Wireless telecommunication is a superior technological
choice for places which do not have fixed line
networks installed. In fact, it appears to be
the preferred choice everywhere. The leading
country in mobile communications, Finland, has
today a mobile phone penetration rate well above
40 per cent of the total population. In 1997,
one Finn in five bought a new phone or replaced
an old model. Earlier predictions of saturation
rates (first 40, then 60 per cent of the population)
have been scrapped. Other European countries
are also becoming "mobile" very fast, and the
United States market is also picking up. In
Asia and the Pacific, the trend is similar --
in cities and countries with high purchasing
power. Major cities in the region already have,
or are developing, mobile phone networks.
Travellers have been able to use international
roaming features of digital mobile phones for
a couple of years. The most popular standard,
GSM, is used in over 100 countries, of which
about half are providing international roaming
services. However, the coverage of any one mobile
phone standard is not truly global. The first
dual band digital phones are on the market,
and towards the end of 1999 even more versatile
phones have been promised, adding
compatibility with analog mobile networks as well.
Although it is possible in many countries to
receive and send short Internet messages with
a mobile phone, that technology has suffered
from a grave handicap: a very low speed of data
transfer. Most mobile phone networks today can
transfer data only at 9,600 bits per second.
With the use of a new ETSI (European Telecommunications
Standards Institute) specification, multi-slot
high speed data service (HSCSD for High Speed
Circuit Switched Data), GSM networks can soon
bring data speeds up to 57.6 kbit/s with a four
time-slot air interface implementation. By the
turn of the millennium that speed will be doubled
to some 115 kbit/s with the emergence of the General
Packet Radio Service (GPRS).
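The quoted figures follow from simple arithmetic, assuming (as the text implies) a rate of 14.4 kbit/s per GSM time slot under the HSCSD coding scheme:

```python
# Back-of-envelope GSM data rates; the 14.4 kbit/s per-slot figure
# is an assumption of this sketch, not taken from the article.
per_slot_kbit = 14.4                  # kbit/s per HSCSD time slot
four_slot = 4 * per_slot_kbit         # four time-slot air interface
print(four_slot)                      # 57.6 kbit/s, the speed quoted above
```

Doubling the effective throughput again, as promised for packet-switched GPRS, yields roughly the 115 kbit/s figure mentioned in the text.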
The next generation of digital mobile phones
will reach even higher data speeds, as Ericsson
recently demonstrated with a prototype. The data
transfer rate reached 384 kbit/s, a speed capable
of handling video conferencing,
not to mention more ordinary Internet services.
A pocket video phone? Yes, please. When? Not
so long after Y2K.
The integration of Internet services and mobile
phones requires the development of new applications
and protocols for network operators, which are
not necessarily transparent to phone users.
A group of leading mobile phone companies has
developed a new Wireless Application protocol
(WAP) as a unified global solution for existing
and future value-added services in wireless
networks. Apart from specifications for transport
and session layers, and security features, WAP
also defines an application environment including
a microbrowser, scripting, value-added telephony
services and content formats. Services can range
from single-line text displays on standard
digital mobile phones to highly sophisticated
smart phone displays. (For details, see http://www.wapforum.org/.)
The leading mobile phone manufacturers (Ericsson
and Nokia, shortly to be joined by Motorola)
recently announced a joint venture, Symbian
Ltd, to develop the non-proprietary EPOC operating
system to suit smartphones and communicators.
Apparently that will open direct competition
with Microsoft's Windows CE, which is a leading
operating system in small palmtop computers.
In the latest major 'wireless' announcement,
a group of five leading IT and telecommunications
companies recently got together to develop a
specification for a new device ("Bluetooth")
that enables short distance wireless communications
between computing devices which are now often
incompatible. Bluetooth devices will be able
to swap information over a range of 12 metres,
which is further than in currently available
infrared devices. The first devices based on
a low cost Intel semiconductor are expected
to be available in the second half of 1999.
E-commerce ready to break through - finally
Electronic commerce (e-commerce) can be defined as the paperless
exchange of routine business information
using formal electronic data interchange
or other electronic technologies, including
electronic mail, bulletin boards, and
electronic funds transfer. E-commerce
technologies are designed to reduce or
replace paper-based work flows with faster
and more reliable communications between trading partners.
Although the basic technology needed for
electronic commerce has been available for several
years, electronic trading has been offered mainly
in selected industries and large corporations
in the United States. But the situation may
change soon as secure transmission and authentication
technologies in the World Wide Web have matured
so that the confidentiality of transactions
can be guaranteed. Credit card companies are
supporting secure electronic transactions, and
several global e-commerce providers have been
established. The future of the text-based formal
EDIFACT and ANSI X.12 standards no longer
looks so clear.
E-commerce has the potential to integrate industries,
small businesses, government agencies, large
corporations, and independent contractors into
a single commercial community across computer
platforms and continents. The volume of e-commerce
is expected to grow exponentially in the next
few years. E-commerce offers a low-cost opportunity
for businesses in developing countries to be
accessed and to sell their products globally.
It can help in acquisition of equipment, raw
materials and better technologies in the private
sector. In the public sector, it can make purchases
more efficient and transparent.
Customers are willing to buy electronically
only if the transaction process is fully confidential
and more convenient than alternative methods.
These conditions are increasingly met with today's
technology, and the various types of goods and
services sold electronically are surprisingly
numerous. Take for instance supermarkets that
are experimenting with Web-based ordering of
groceries, combined with home or office delivery.
Let us take a look at some technology issues
that support an optimistic view of e-commerce prospects.
Secure Electronic Transaction
Visa International and MasterCard agreed in
February 1996 to develop a single technical
standard, Secure Electronic Transaction (SET)
specifications, for safeguarding payment card
purchases made over computer networks. Several
leading technology companies participated in
the effort, including Microsoft, IBM, Netscape,
SAIC, GTE, RSA, Terisa Systems, and VeriSign.
The SET standard addresses key issues of e-commerce,
namely confidentiality, authentication and integrity
of all transmission and all parties.
In December 1997, Visa and MasterCard formed
SET Secure Electronic Transaction LLC (commonly
referred to as "SETCo") to implement the SET
Secure Electronic Transaction 1.0 specification
(released in June 1997). American Express and
JCB Credit Card Company are expected to join
soon as co-owners of SETCo.
Pilot applications are being tested in several
countries, including Singapore. In June 1998,
the first four vendors (GlobeSet Inc., Spyrus/Terisa
Systems, Trintech, and VeriFone, Inc.) successfully
completed their pilots and were allowed to use
the SET-compliant trademark for their software.
Although initially implemented in
software, vendors are currently developing SET
hardware implementations for Certificate Authorities,
Payment Gateways and Merchant Servers.
Public key encryption
A fully-fledged e-commerce transaction requires
several separate data transmissions between
customer, merchant and credit card company.
To protect confidentiality, all transmissions
must be encrypted.
The SET standard uses the public key encryption
method, of which the first versions were developed
by mathematicians in the 1970s. The method gained
renewed attention in 1994 when Philip Zimmermann's
strong-encryption program "Pretty Good Privacy"
(PGP) became widely available. Ever since, the issue of public
key cryptography has been hotly debated in the
United States. The Government still opposes
exports of encryption software with longer than
64-bit encryption keys. (The commercial versions
of PGP, ViaCrypt PGP v2.7.1 and v4.0 support
up to 2048-bit RSA keys). It reasons that the
almost perfect security of long public key encryption
could be used to conceal e-mail and data traffic
that supports criminal and terrorist activities.
In public key encryption, customers, merchants,
and credit card companies know each other's
public keys; they might have obtained them by
ordinary e-mail, which is not secure at all,
or from each other's public Web sites. In addition,
each party has its own secret key, which it will
not reveal to anyone. The sender uses its own secret
key and the other party's public key to encrypt
the data to be transmitted. The receiver opens
the message with the sender's public key and
its own secret key. In other words, the public key
and the secret key work in tandem and must be
used as a matched pair.
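The pairing of public and secret keys can be illustrated with a toy RSA example. The tiny textbook primes below are for readability only; real keys are hundreds of digits long, and this sketch shows only encryption, not the signing side.

```python
# Toy RSA key pair -- illustration only, nothing this small is secure.
p, q = 61, 53
n = p * q                    # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                       # public exponent, part of the public key
d = pow(e, -1, phi)          # secret exponent, the matching secret key

def encrypt(m: int) -> int:
    """Anyone who knows the public key (e, n) can encrypt."""
    return pow(m, e, n)

def decrypt(c: int) -> int:
    """Only the holder of the secret key (d, n) can decrypt."""
    return pow(c, d, n)

message = 42
assert decrypt(encrypt(message)) == message   # the keys work as a pair
```

Note how neither key alone recovers the message: the public exponent scrambles it, and only the matching secret exponent undoes the operation.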
Public key encryption is almost impossible
to break. Cracking a 128-bit PGP message, for
instance, would require so much computing power
that nobody would even attempt it for normal business
transactions. The downside of public key encryption
is that the process is slower than other encryption methods.
In practice, public key encryption compromises
security only if the secret key falls into the wrong
hands. Alternative methods are based on symmetric
keys (the same key encrypts and opens the message),
and the probability of the key getting into
the wrong hands is much greater. Besides, it
would be impracticable for large enterprises
to dedicate a unique key to each customer.
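The symmetric alternative described above can be sketched with a toy XOR cipher (a stand-in for real symmetric algorithms such as DES; illustration only, not secure):

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric encryption: the same key both encrypts and decrypts."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# The shared key must somehow be delivered to every trading partner.
shared_key = b"secret"
ciphertext = xor_cipher(b"PAY 100 USD", shared_key)

# Applying the same key a second time recovers the plaintext.
assert xor_cipher(ciphertext, shared_key) == b"PAY 100 USD"
```

The key-distribution weakness is easy to see here: every partner must receive `shared_key` in advance, and every delivery channel is a chance for it to leak.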
Before a transaction takes place, parties need
to be certain about the identity of the partner(s).
A proper e-commerce transaction therefore requires
several authentication checks:
- The customer is really
dealing with the intended merchant
- The merchant is certain
about the identity of the customer
- The credit card company
is certain about the identities of both the customer
and the merchant
In the SET model, SETCo acts as a "root" certificate
authority, and issues digital certificates
to payment card brands that wish to participate
in SET. At the time of transaction, SET compliant
software validates the authenticity of the trading
partners by examining each other's digital certificates.
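The chain of trust described here can be sketched in miniature: the root key signs the card brand's certificate, and the brand's key signs the merchant's. The toy RSA signing below uses tiny primes and a truncated hash purely for illustration, and all names are hypothetical.

```python
import hashlib

def make_keys(p, q, e=17):
    """Toy RSA key pair; returns (public, secret). Not secure."""
    n, phi = p * q, (p - 1) * (q - 1)
    return (e, n), (pow(e, -1, phi), n)

def digest(data: bytes, n: int) -> int:
    """Hash reduced modulo the tiny modulus -- illustration only."""
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % n

def sign(data: bytes, secret) -> int:
    d, n = secret
    return pow(digest(data, n), d, n)

def verify(data: bytes, sig: int, public) -> bool:
    e, n = public
    return pow(sig, e, n) == digest(data, n)

root_pub, root_sec = make_keys(61, 53)     # "root" authority (SETCo's role)
brand_pub, brand_sec = make_keys(89, 97)   # a payment card brand

brand_cert = b"brand:ExampleCard;key:" + str(brand_pub).encode()
brand_sig = sign(brand_cert, root_sec)          # issued by the root
merchant_cert = b"merchant:ExampleShop"
merchant_sig = sign(merchant_cert, brand_sec)   # issued by the brand

# Validating the chain: trust flows down from the root certificate.
assert verify(brand_cert, brand_sig, root_pub)
assert verify(merchant_cert, merchant_sig, brand_pub)
```

A forged certificate fails verification because only the issuer's secret key can produce a signature that opens correctly under the issuer's public key.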
The certification authorities need to be trusted
third parties that in no circumstances would
release confidential information. They must
assure a timely remedy in case private keys
are compromised, by invalidating the certificate
associated with a compromised key.
Many banks are expected to start offering digital
certificates to merchants and cardholders in
mid-1998. Cardholders and merchants will be
able to request digital certificates from participating
banks via the Internet. A certificate authority
will issue the digital certificates electronically
after validating the required information.
Chronologically, the first condition for electronic
transactions is the issuance of digital
certificates to each transaction participant.
This is a one-time procedure for first-timers.
The customer browses product catalogues that
are made available by merchants or e-commerce
providers. Product information may be public,
in which case a normal Web server is sufficient
to service customers. Confidential product information
can be protected by various user authentication
procedures, passwords being the simplest and
digital certificates providing higher security.
Some merchants provide an interface from which
customers can conveniently place items into
a "shopping basket" while browsing, and then
make a purchase request for all the
items. To have a purchase request sent, the
buyer needs to use his client software to send
his credit card information to the merchant's
server for authorization.
The merchant's server processes the purchase
request and sends a payment authorization
request to the merchant's bank, where the authorization
request is handed over to the traditional credit
card network (Visa, MasterCard, etc.). The response
(accept/decline) is returned to the merchant
bank's payment gateway, and on to the merchant's
server.
Payment settlement is done in the traditional
way between the customer's bank and the merchant's
bank.
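The authorization round trip described above can be sketched as plain function calls. All names and the hard-coded card list are hypothetical, and a real SET flow encrypts and certifies every hop.

```python
# Hypothetical sketch of the SET authorization round trip.
APPROVED_CARDS = {"4111111111111111"}   # stands in for the card network's records

def card_network_authorize(card_number: str, amount: float) -> str:
    """The traditional credit card network returns accept/decline."""
    return "accept" if card_number in APPROVED_CARDS else "decline"

def merchant_bank_gateway(card_number: str, amount: float) -> str:
    """The merchant bank's payment gateway forwards the request."""
    return card_network_authorize(card_number, amount)

def merchant_server(purchase_request: dict) -> str:
    """The merchant's server sends a payment authorization request."""
    return merchant_bank_gateway(purchase_request["card"],
                                 purchase_request["amount"])

response = merchant_server({"card": "4111111111111111", "amount": 59.90})
assert response == "accept"
```

The response travels back along the same path it came: card network to payment gateway to merchant's server, which then confirms or declines the purchase to the customer.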
Obviously, there are variations of these e-commerce
models. Many are not as automated as SET and
do not require the customer's certification;
they may be based on product displays on a Web
server and use traditional payment methods,
such as account transfers, cheques, or fax transmissions
of credit card numbers.
EDI is still alive
It is not possible to finish this article without
mentioning electronic data interchange (EDI).
There are two main standards: UN/EDIFACT, which
is dominant everywhere except in the USA,
where the older ANSI X12 standard still prevails.
EDI messages are used for processing logistical
information (orders, invoices, bills of lading,
etc.). Although rather widely used, EDI standards
have some serious shortcomings. They were created
at the time of telex technology for character-based
messaging. Therefore it is not surprising that,
especially with improved security features,
merchants prefer to use the more versatile Web
as their product presentation platform. Images,
sounds, and video sell better than plain words,
and it looks like the formal EDI standards will
have to be adapted to the interactive cyber
world, one way or the other.
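The telex-era, character-based nature of EDI shows in the raw syntax: an apostrophe ends each segment and plus signs separate data elements. A minimal sketch of splitting such a message follows; the ORDERS fragment is illustrative, and the sketch ignores EDIFACT's "?" release (escape) character.

```python
def parse_edifact(message: str):
    """Split a raw EDIFACT message into segments and data elements.
    Simplified: does not handle the '?' release character."""
    segments = [s for s in message.split("'") if s]
    return [seg.split("+") for seg in segments]

# Illustrative fragment of a purchase order (ORDERS) message.
raw = "UNH+1+ORDERS:D:96A:UN'BGM+220+PO-12345'QTY+21:48'"
for tag, *elements in parse_edifact(raw):
    print(tag, elements)
```

Every piece of the message is plain text, which is exactly why images, sound, and interactive Web pages sit so awkwardly beside it.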
More about e-commerce:
Electronic commerce initiatives
of ESCAP: Business facilitation needs. Studies
in trade and investment, no. 31. United
Nations Economic and Social Commission for
Asia and the Pacific. 1998.