Intro to Software Architecture
Designing a system to perform well for all its anticipated and unanticipated functions requires us to think about the architecture of that system, and most importantly, about the way in which its software operates and interacts.
Don't think you know much about software architecture? Well, it's not so different from other kinds of architectural design, whether in buildings, urban planning, or business processes. And to know something about modularity is to know something about the architecture of the Internet and the World Wide Web.
Wikipedia: Software architecture is like building architecture in that it has to do with the purpose, themes, materials, and concept of a structure. A software architect employs extensive knowledge of software theory and appropriate experience to conduct and manage the high-level design of a software product. The software architect develops concepts and plans for software modularity, module interaction methods, user interface dialog style, interface methods with external systems, innovative design features, and high-level business object operations, logic, and flow.
Remember way back on the first day when I said that one of the challenges to this course, and to all thinking in the Networking Age, is thinking in the abstract? As we learn more about software architecture you will begin to know more about what I mean.
When we're talking about the Internet and how the World Wide Web works we can't help but speak in the abstract. The Internet is a huge, globally interconnected system of data communication consisting of a mesh of computer networks running many millions of applications. No designer sat down to mark out the exact coordinates of each computer on the Internet or the exact applications that would use it. No, they conceived of what could become an organic, scalable, distributed application by sketching out the concept in the abstract: how could computers communicate easily and efficiently with the least amount of shared knowledge and synchronicity?
For that fundamental coordination which was absolutely necessary, how could we establish basic rules for communication and a standard vocabulary for establishing a shared understanding? This was their challenge, and what did they come up with? The theories of packet switching, burst communications, and TCP/IP. These gave computers a decentralized approach to global communications which, in theory, depended on no central authority. Conceivably, to route information between computers in a globally interconnected network, any number of routes through any computers could be used. Then they built prototypes to test the theory, and it worked! And then, after working out the bugs and further refining (and publishing) standards and protocols, they put their system into practice and it grew! So long as other computers used the simple standard rules for communicating in this fashion, they would become part of the wired mesh that was the Internet. What's more, their system did not put undue requirements on how the computers actually worked, only on how they communicated. So long as a computer knew how to exchange data in this way, it could pursue its work in any manner independently. It could provide any function to the user. It could use whatever technology its designers wanted.
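To make this idea concrete, here's a toy sketch of my own (not from the original standards, and using a made-up "echo it back upper-cased" rule rather than a real protocol): two programs that agree only on an address and a rule for exchanging bytes over TCP. Neither side knows or cares how the other is built internally, which is exactly the kind of protocol-only coupling described above.

```python
# Two endpoints that share nothing except an address and one agreed rule:
# "send some bytes, get them back upper-cased." (The rule is invented for
# illustration -- the point is that only the rule is shared, not internals.)
import socket
import threading

def run_server(sock):
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)       # receive the request bytes
        conn.sendall(data.upper())   # reply according to the agreed rule

# The server binds to an ephemeral port on the local machine.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=run_server, args=(server,), daemon=True).start()

# The client needs only the address and the rule -- nothing else.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"hello internet")
reply = client.recv(1024)
client.close()
print(reply.decode())  # HELLO INTERNET
```

Either side could be rewritten in any language, on any hardware, and the exchange would still work, so long as the shared rule is respected.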
The Web wasn't so different. Tim Berners-Lee was a software developer at CERN, the respected physics research laboratory in Switzerland. He wanted to devise a collaborative document management system that would solve some problems they were having collaborating over great distances, with competing schedules, and across barriers of technology, language, and culture. So he set out to devise a document collaboration system that created an information space but imposed no great requirements on each contributor. Such a system, he felt, would need little or no central administration and should interoperate across the disparate technologies, cultures, geographies, and time zones of its far-flung contributors. And yet the coordination of these parts should not restrict the independence of each contributor. He recognized that hypertext and the decentralized architecture of the Internet were partial building blocks he could use to create this persistent universal information space, and he set out to build the rest. He theorized the project, coming up with a simple distributed system for addressing, discovering, browsing, creating, and serving pages. He called it the World Wide Web, tested it on a small scale, and worked out the well-defined standards and protocols that would allow it to grow organically in all directions while remaining universal and open, respecting the independence and the inclusiveness of each member. To put it into practice, he triggered a network effect that recursively attracted content providers, application developers, and users in an ongoing cycle of positive reinforcement. Usage accelerated, and the rest is history. We now have an open, developing, universal information space that easily and efficiently, with the least amount of shared knowledge and synchronicity, allows us to share in online content.
The story of the Internet and the Web resonates with our discussion of modularity. Remember modularity? Modularity, the extent to which a system is made up of pieces independent in their own right, which makes for the easy assembly of simple autonomous parts into complex structures, is a hallmark of new software; software that's built for networking. Systems that claim to be universal, cross-platform, and supportive of new innovations and disparate uses absolutely depend on it. Both the Internet and the WWW are modular by definition, and as a result they not only perform their known functions well, they support new unanticipated functions and uses.
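Modularity is easy to show in miniature. Here's a small sketch of my own (the function names and the text-to-text interface are invented for illustration): a pipeline that requires only that each part expose the same simple interface, so parts can be added, removed, or swapped without touching the rest of the system.

```python
# Toy modularity: each "module" is an independent callable from text to
# text. The pipeline knows nothing about any module's internals -- the
# parts are coupled only through that one shared interface.
def uppercase(text):
    return text.upper()

def reverse(text):
    return text[::-1]

def pipeline(parts, text):
    for part in parts:        # each module works independently...
        text = part(text)     # ...joined only by the common interface
    return text

result = pipeline([uppercase, reverse], "web")
print(result)  # BEW
```

Because the parts agree only on the interface, new and unanticipated modules can be dropped in later, which is the property the Internet and the Web exploit at vastly larger scale.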
Modularity is a design characteristic of a broader exercise we can call software architecture. As these stories indicate, the specific way in which we deploy solutions means a lot. Our older computer systems, which did little to harness the power of networking, used a different approach to building software. Many of them were built on the view of the computer as a self-contained structure: a closed system with a single central disk, a single central CPU, and a single central memory space. Remember, computers were viewed as computation devices, not communication devices. But with the advent of networking, this monolithic, centralist approach gave way to a micro, decentralist approach that better fit the principles of modularity, internetworking, and micro-electronics. This is what software architecture is all about: designing a system to perform well for all its anticipated and unanticipated functions.
Posted by Mark Hemphill on October 14, 2004 | Permalink
I don't know about you guys but I was following along with everything fairly well until yesterday's class. I felt more confused when I left than I have since the course started. Just wondering if anyone else felt this way? Maybe things will get clearer as Chapter 4 progresses, or maybe it's the thinking in the "abstract" that is the problem for me? Just wondering where the rest of you were at?
Mark, are there clear definitions for the "computer, Internet & WWW?"
Posted by: Tracey Gallant | Oct 15, 2004 8:28:13 AM
I agree with you Tracey. Chapters 1-3 were fairly straightforward, but Chapter 4 seems to be a little bit tougher to get a hold on. I'm sure it will eventually sink in though...hopefully anyways.
Posted by: Steve MacLeod | Oct 17, 2004 9:59:51 PM
Posted by: Mark | Oct 18, 2004 8:55:18 AM
The comments to this entry are closed.