Since the early 1990s, I’ve watched with great interest the rapid evolution of the software industry and its developers, an evolution that ushered in the Internet age along the way. I’d like to share my observations on that subject. Because of the breadth and complexity of the topics involved, I’ll chronicle them in a series of posts, starting with this one.
Up until the mid-1990s, software developers routinely built products and custom applications from scratch, hand-coding them in an imperative programming language such as assembly, C, C++, FORTRAN, or Pascal. The process typically started with IT deciding on the project’s scope, requirements, and technology, followed by developers taking the assignment, writing a functional specification, and implementing it. Developers were largely coders then, with predefined responsibilities and relatively limited authority.