Client-server networking grew in popularity many years ago as personal computers (PCs) became the common alternative to older mainframe computers. Client devices are typically PCs with network software applications installed that request and receive information over the network. Both mobile devices and desktop computers can function as clients.
A server device typically stores files and databases and hosts more complex applications such as Web sites. Server devices often feature higher-powered central processors, more memory, and larger disk drives than clients.
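As a minimal illustration of this request/response relationship, the sketch below pairs a toy server and client in Python using the standard socket library. The address, port number, and message contents are illustrative assumptions, not part of any particular system.

```python
# Minimal client/server sketch. The address, port, and messages below are
# illustrative assumptions for the demo only.
import socket
import threading

HOST, PORT = "127.0.0.1", 5000  # assumed local address and port
ready = threading.Event()


def run_server():
    """Server: waits for a client request and returns a response."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()  # signal that the server is accepting connections
        conn, _addr = srv.accept()
        with conn:
            request = conn.recv(1024).decode()
            conn.sendall(f"server reply to: {request}".encode())


def run_client():
    """Client: sends one request and prints the server's response."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
        cli.connect((HOST, PORT))
        cli.sendall(b"GET account balance")
        print(cli.recv(1024).decode())


server = threading.Thread(target=run_server)
server.start()
ready.wait()  # don't connect until the server socket is listening
run_client()
server.join()
```

Running the script starts the server on a background thread, sends a single request from the client, and prints the server's reply, which is the whole request/response cycle in miniature.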
Client/server is one of the computer industry's newest and hottest buzzwords. There is no generic definition of client/server, as the term is used to describe a number of mature, developing, and anticipated technologies. However, the general idea is that clients and servers are separate logical entities that work together over a network to accomplish a task.
Client-server is very fashionable, and as such it might be just a temporary fad; but there is general recognition that it represents something fundamental and far-reaching. For example, the Gartner Group, leading industry analysts in this field, have predicted that
"By 1995 client-server will be a synonym for computing."
Most of the initial client/server success stories involve small-scale applications that provide direct or indirect access to transactional data in legacy systems. The business need to provide data access to decision makers, the relative immaturity of client/server tools and technology, the evolving use of wide area networks, and the lack of client/server expertise make these attractive yet low-risk pilot ventures. As organizations move up the learning curve from these small-scale projects towards mission-critical applications, there is a corresponding increase in performance expectations, uptime requirements, and the need to remain both flexible and scalable. In such a demanding scenario, the choice and implementation of an appropriate architecture becomes critical. In fact, one of the fundamental questions that practitioners have to contend with at the start of every client/server project is: "Which architecture is more suitable for this project, two tier or three tier?" Interestingly, 17% of all mission-critical client/server applications are three-tiered and the trend is growing, according to Standish Group International, Inc., a market research firm.
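The practical difference between the two choices can be sketched in a few lines. In the schematic Python example below, sqlite3 stands in for the corporate database; the table, figures, and function names are invented for illustration. In the two-tier form the client issues SQL and applies business rules itself, while in the three-tier form those responsibilities sit behind a middle-tier service interface.

```python
# Schematic contrast of two-tier vs. three-tier, with sqlite3 standing in
# for the corporate database. Table, data, and names are assumptions made
# for illustration only.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")
db.execute("INSERT INTO orders (amount) VALUES (120.0), (80.0)")


# Two tier: the client talks to the database directly, so SQL and
# business rules are duplicated on every desktop.
def two_tier_client_total():
    rows = db.execute("SELECT amount FROM orders").fetchall()
    return sum(amount for (amount,) in rows)  # business rule runs on the client


# Three tier: business logic and database access are centralized in a
# middle-tier service; clients only know the service interface.
def middle_tier_order_total():
    (total,) = db.execute("SELECT SUM(amount) FROM orders").fetchone()
    return total


def three_tier_client_total():
    return middle_tier_order_total()  # client never touches SQL

print(two_tier_client_total(), three_tier_client_total())  # 200.0 200.0
```

Centralizing the logic in the middle tier is what gives the three-tier form its flexibility: the database schema or the business rules can change without redeploying every client.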
Architecture affects all aspects of software design and engineering. The architect considers the complexity of the application, the level of integration and interfacing required, the number of users, their geographical dispersion, the nature of the networks, and the overall transactional needs of the application before deciding on the type of architecture. An inappropriate architectural design or a flawed implementation could result in horrendous response times. The choice of architecture also affects the development time and the future flexibility and maintenance of the application. Current literature does not adequately address all these aspects of client/server architecture. This paper defines the basic concepts of client/server architecture, describes the two-tier and three-tier architectures, and analyzes their respective benefits and limitations. Differences in development effort, flexibility, and ease of reuse are also compared in order to aid further in the choice of an appropriate architecture for any given project.
Chapter 2
History & Definition
History
The University of Waterloo implemented Oracle Government Financials (OGF) in May of 1996. That moved UW's core accounting systems to a vendor-supported package in a Solaris/Unix environment and away from locally developed packages on IBM/VM. Plans at that time were to move more (if not all) business systems to a single vendor and to standardize on a single database platform (Oracle for both). A very large, state-of-the-art Solaris system was purchased with the intention of co-locating these other Oracle-supplied services on the same system as the OGF. A network security architecture was planned that involved isolating administrative networks, firewalling those networks with protocol filters, and actively monitoring traffic. Systems were purchased and deployed to implement that security architecture.
Much has changed in the interim. While the OGF now includes more services than the 1996 suite, the plan to move all business systems has failed. Notably, we require PeopleSoft/HRMS (Human Resources Management System) for Payroll (deployed in the fourth quarter of 1998), with PeopleSoft/SIS (Student Information Services) to follow some years hence; Oracle was unable to deliver these key components for our business. We have also discovered that, while it is reasonable to require Oracle as the database when other applications are specified, it is unreasonable to expect that they will be certified with the same versions of the Oracle database and/or the underlying operating system. Technology changes quickly too: the state-of-the-art Solaris system is no longer current. Networks were restructured to isolate administrative systems in the "Red Room" and administrative users throughout the campus. However, the administrative firewall and active traffic monitor were never implemented, and the supporting systems have recently been dismantled.