About Designing Application Technology Stacks
Make better digital investments when designing your application technology stack
Hi there, it’s Niels. 🤗 Welcome to my newsletter. This newsletter helps with the creation of valuable software solutions, making better software investments, and improving the developer experience. Questions? Ask them here.
This edition should help you to evaluate the technology components needed to put together a technology stack for one or more applications. I will cover:
Application architecture patterns
Libraries, frameworks, cloud APIs, and low-code platforms
Seven or eight years ago, I was part of a team tasked with investigating how our company could replace or evolve its technology stack for building internal and customer-facing web applications.
In that current technology stack:
Most of the data was stored in an on-premises Oracle relational database.
Most business logic was written in PL/SQL, the procedural language available inside an Oracle database.
Oracle Forms was used as the front-end for internal applications.
Customer-facing applications were built using diverse Java web frameworks that evolved over time. Some applications used various early versions of Apache Struts; later on, the Spring MVC framework was also introduced.
Some customer-facing applications required more browser-based interactivity. To support that, we also built single-page applications using Angular.
The web layer was only a thin layer and called database (PL/SQL) procedures for executing business logic.
That architecture was suboptimal, but like every architecture, it had evolved over time. Most of the applications were internal ones supporting the business; the need for customer-facing applications to improve the service for customers grew over time.
It was hard to unit-test database procedures. It was also a pain to release new versions of web applications to production.
The idea from management was to investigate whether we could gradually replace the business logic layer (PL/SQL) and the presentation layer for internal applications with Java technology (since this was probably more future-proof).
The web team that I was part of was heavily involved in the change initiative, since we were already using Java technology. The people working on the database and business layer were only marginally involved. The web team was a minority, and we were overall young, inexperienced, and lacked mentorship. A consultant was brought in to support us. We were advised to use Vaadin for the front end and Enterprise JavaBeans for the business layer.
Long story short, we were part of “an internal battle” we could not win since:
We were a minority and there was resistance to change from the other stakeholders.
The solution proposed by the consultant was, in retrospect, not the best one.
We struggled due to inexperience and lack of support, and the change project died silently.
Looking back, I learned a great deal from a “failed” change initiative like this one. It also helps to look back at how the technology landscape evolved since then.
A lot of things have changed since then regarding technologies:
The web frameworks (or their versions) from that period are seriously outdated.
Many architectures are moving partially or completely to the cloud, which offers almost unlimited scalability.
(Cloud-based) tooling supporting the development lifecycle, e.g. monitoring, logging, and CI/CD, has increased in quantity and quality.
Low-code (cloud-based) platforms for creating web applications are also on the rise.
We were not aware of Docker, which was still in its infancy back then. Now it is the de facto standard for packaging and deploying applications.
Most of our applications were (modular) monoliths. There was a lot of buzz about microservices, but we knew that if we went that way, complexity would shift toward managing the microservices.
Other aspects have not changed that much:
Java is still a very popular language, especially for writing enterprise back-end code. The language evolved but stayed backward-compatible; more functional features were added over time. Other languages that were popular back then, like PHP, C#, and Python, are still widely used.
The three-tier architecture and the hexagonal architecture are both still very relevant.
Every solution should be built based on requirements. Three types of requirements come to mind:
Business requirements: these motivate WHY the business should do something. This is about nailing down what the problem is.
Functional requirements: define the expected behavior of a solution or WHAT should be done to solve the problem defined as a business requirement.
Non-functional requirements: they define the constraints on HOW that solution should be built.
When evaluating a future technology stack, non-functional requirements are key. For every application that should be built, these requirements will be different.
There is no one-size-fits-all technology stack for all applications
Some examples of non-functional requirements:
Operability: e.g. the team must already be able to operate the chosen technology
Testability: e.g. automated testing of business logic
Financial viability: e.g. the total cost of ownership per year
Application Architecture Patterns
The three-tier architecture is still relevant today for organizing applications. It separates an application into three logical tiers. These tiers can also be physically divided, enabling different teams to work on the different layers. A tier can also be subdivided into different logical and physical components.
These tiers are layered on top of each other:
Presentation tier: the top level of the architecture; the driving integration, e.g. a command-line interface or a web application.
Business tier: the middle tier; contains the business logic and acts as the bridge between the driving presentation tier and the driven data tier.
Data tier: the bottom level of the architecture; the driven integration; contains the data-access layer that manages access to data storage like a relational database.
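The layering above can be sketched in Java. This is a minimal sketch; all class and method names are illustrative, and an in-memory map stands in for a real database:

```java
import java.util.HashMap;
import java.util.Map;

// Data tier: manages access to data storage (an in-memory map stands in for a database)
class CustomerRepository {
    private final Map<Integer, String> storage = new HashMap<>();

    void save(int id, String name) { storage.put(id, name); }
    String findById(int id) { return storage.get(id); }
}

// Business tier: contains the business logic and bridges presentation and data tiers
class CustomerService {
    private final CustomerRepository repository;

    CustomerService(CustomerRepository repository) { this.repository = repository; }

    String register(int id, String name) {
        if (name == null || name.isBlank()) throw new IllegalArgumentException("name required");
        repository.save(id, name);
        return "Registered customer " + id;
    }
}

// Presentation tier: the driving integration, here a command-line interface
public class CustomerCli {
    public static void main(String[] args) {
        CustomerService service = new CustomerService(new CustomerRepository());
        System.out.println(service.register(1, "Ada"));
    }
}
```

Because each tier only talks to the one directly below it, the presentation tier never touches the database directly.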
Alistair Cockburn proposed the hexagonal architecture in 2005. It’s definitely not new and remains very relevant and useful.
The hexagonal architecture can be viewed as a flexible extension of the three-tier architecture. The application layer containing business logic remains at the heart of the architecture.
All integrations including databases and user interfaces are part of the infrastructure layer. It supports integrations with multiple driving actors and multiple driven actors.
Driving actors: the ones that initiate the interaction like different user interfaces
Driven actors: the ones that are put into motion by the application e.g. a database or a REST API. Both can be used to fetch or store data.
The integration between the actors and the application layer occurs via ports and adapters.
For a driving port:
a driving adapter will use the port
an application service will implement the interface defined by the port
both the port’s interface and its implementation are inside the hexagon
For a driven port:
a driven adapter will implement the port
an application service will use it
the port is inside the hexagon, but its implementation is in the adapter outside of the hexagon
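These rules can be sketched in Java (all names are illustrative): a driving port implemented by an application service inside the hexagon, and a driven port implemented by an adapter outside of it.

```java
// Driven port: defined inside the hexagon, implemented by an adapter outside of it
interface OrderRepository {
    void store(String orderId);
}

// Driving port: defined and implemented inside the hexagon
interface PlaceOrder {
    String place(String product);
}

// Application service inside the hexagon: implements the driving port, uses the driven port
class OrderService implements PlaceOrder {
    private final OrderRepository repository;
    private int counter = 0;

    OrderService(OrderRepository repository) { this.repository = repository; }

    @Override
    public String place(String product) {
        String orderId = "order-" + (++counter) + "-" + product;
        repository.store(orderId);
        return orderId;
    }
}

// Driven adapter outside the hexagon: implements the driven port
// (a real adapter might wrap a database or a REST API)
class InMemoryOrderRepository implements OrderRepository {
    final java.util.List<String> stored = new java.util.ArrayList<>();
    @Override public void store(String orderId) { stored.add(orderId); }
}

// Driving adapter outside the hexagon: e.g. a CLI that calls the driving port
public class OrderCli {
    public static void main(String[] args) {
        PlaceOrder placeOrder = new OrderService(new InMemoryOrderRepository());
        System.out.println(placeOrder.place("book"));
    }
}
```

Swapping the in-memory repository for a database-backed one requires no change inside the hexagon, which is exactly the flexibility the pattern promises.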
Libraries, frameworks, cloud APIs, and low-code platforms are all pieces of reusable functionality written by an external party. Their purpose is to help solve common problems in easier ways.
It’s crucial to look at the non-functional requirements when evaluating these reusable components. Their choices have consequences.
Libraries & Frameworks
The technical difference between a framework and a library lies in the inversion of control.
A framework tells the developer what it needs. The framework is in control.
With a library, the programmer is in control. The programmer calls the library where and when they need it.
It is difficult to replace a framework; a library can easily be replaced with another library.
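The inversion of control can be shown in a few lines of Java. This is a toy sketch, not any real framework: the "framework" here is just a function that owns the loop and calls back into code the developer plugs in.

```java
import java.util.List;
import java.util.function.Function;

public class InversionOfControl {

    // Library style: the programmer is in control and calls this where and when needed
    static int wordCount(String text) {
        return text.isBlank() ? 0 : text.trim().split("\\s+").length;
    }

    // Framework style: the "framework" owns the control flow and calls back into
    // the handler the developer supplies (it decides when and how the handler runs)
    static void miniFramework(List<String> requests, Function<String, String> handler) {
        for (String request : requests) {
            System.out.println(handler.apply(request));
        }
    }

    public static void main(String[] args) {
        // Library: we decide where and when to call it
        System.out.println(wordCount("design your stack"));

        // Framework: we hand over control and only fill in the blanks
        miniFramework(List.of("/home", "/about"), path -> "handled " + path);
    }
}
```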
I personally favor lightweight frameworks (e.g. Express) with a limited amount of responsibilities. This makes them easier to replace compared to a framework that is used for everything (This is how using Spring feels).
I’ve mixed Groovy, Java, and Kotlin code in the same code base in a past project (I’m not saying you should do that). These three languages can all run in the Java Runtime Environment.
Because of the inversion of control, frameworks are tougher to debug.
I remember times that I spent more time debugging Spring code than my own code that I plugged into the framework.
I prefer to be in control; it improves the understandability of the code. This doesn’t mean I’m against using frameworks.
Heavier-weight frameworks also have a steeper learning curve: if you don’t get the framework, you won’t get the codebase.
Cloud APIs & Low-Code Platforms
Unlike libraries and frameworks, cloud APIs and low-code platforms don’t run in the same runtime as your application. They are external components running in the cloud.
Cloud APIs are similar to libraries, and low-code platforms are related to frameworks. With cloud APIs, the developer is in control. With low-code platforms, the platform is in control and the developer fills in the blanks.
Cloud APIs and low-code platforms both offer scalability and security advantages (or at least they should).
Libraries offer an API to developers so that functionality can be reused. Cloud APIs also offer an API to the developer. Instead of being called directly in code, the API is accessed:
via a network protocol like HTTP(S) or MQTT(S)
using a message format like JSON or XML
A cloud API can be wrapped into a library, where the library contains the abstraction for:
connecting to the API
transforming the request and response to data structures that can be understood by the runtime.
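Such a wrapper might look like the sketch below, assuming a hypothetical email API. The endpoint, payload shape, and class names are invented for illustration; real cloud APIs like SendGrid define their own contracts and usually ship their own client libraries.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// A hypothetical email cloud API wrapped in a small client library
public class EmailApiClient {
    private final HttpClient http = HttpClient.newHttpClient();
    private final String baseUrl;
    private final String apiKey;

    public EmailApiClient(String baseUrl, String apiKey) {
        this.baseUrl = baseUrl;
        this.apiKey = apiKey;
    }

    // Transform runtime data structures into the message format (JSON) the API expects
    static String toJson(String to, String subject) {
        return "{\"to\":\"" + to + "\",\"subject\":\"" + subject + "\"}";
    }

    // Connect to the API over the network protocol (HTTPS) and send the request;
    // the "/v1/send" path is an invented example endpoint
    public int sendEmail(String to, String subject) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(baseUrl + "/v1/send"))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(toJson(to, subject)))
                .build();
        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
        return response.statusCode();
    }
}
```

The calling code only sees `sendEmail(...)`; the network protocol and message format stay hidden behind the wrapper, which also makes the cloud API easier to swap out later.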
The goal of low-code platforms is to enable developers to create technology solutions faster and cheaper, with a limited need for technical skills in building, testing, and deploying functionality.
Low-code platforms dictate the flexibility you have over the end solution. The more flexibility you have, the steeper the learning curve. While you don’t need skills like writing code (including tests) and setting up CI/CD pipelines, be aware that you will need to invest considerable time to learn the platform.
It’s very difficult, often impossible to replace a low-code platform as part of the technology solution. The more responsibilities a platform has as part of the total solution, the more difficult it will be to replace it.
Debugging and testing are also often pain points in low-code platforms. Testing expected behavior is usually a manual task, and most platforms don’t offer a way to mock calls to integrations like a database.
Cloud API examples: SendGrid Email API, Twilio Communication APIs, InfluxDB Cloud API for storing and retrieving time series data.
Low-Code Platform examples: Bubble for building web applications, Waylay for automating data workflows, Canonic for building back-ends
It’s easy to make bad choices given the abundance of different technology building blocks.
An application can be logically (and physically) divided into layers.
The three-tier architecture divides an application into a presentation, business, and data layer. The hexagonal architecture separates the application (including business logic) from all types of integrations (including presentation and data access). This makes this type of architecture more flexible.
Libraries, frameworks, cloud APIs, and low-code platforms are all components containing reusable functionality. They all strive to solve common problems in an easier way.
Selecting these components should be done wisely. They all have their advantages and disadvantages. It’s critical to start from the non-functional requirements during evaluation.