One of the most important habits a successful IT consultant forms is keeping an eye on which skills are in demand in the marketplace. Collecting the data isn’t easy – one cannot simply skim a job board such as Dice – and once collected, one must still determine which skills are part of a passing fad and which will provide a basis for long-term success.
As an independent consultant for much of the last twenty years, I developed this habit. One technique I used was to track particular skillsets and the roles they were being requested for, to see the pattern of usage. If a skill showed potential, I would invest time in learning how it worked and, for some, how to implement it. I’d like to discuss one today that will be in demand for the next decade: coding to an event-driven design, or event-driven architecture.
Event-driven architecture can be implemented in a number of ways. In the Java world, it most often means employing the command query responsibility segregation (CQRS) pattern over a messaging platform such as Apache Kafka, with the functionality written as microservices. The CQRS pattern segregates the operations that read data from the operations that manipulate it. The advantage of the pattern is that it naturally leads to higher throughput and better security than the older create-read-update-delete (CRUD) pattern that evolved alongside relational databases. For example, if MySQL is the data source, the query path can connect to a read replica while the update path connects to the primary instance.
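To make the segregation concrete, here is a minimal in-memory sketch of the pattern. The class and method names are illustrative (not from any real system): the command side mutates a write store and projects changes into a separately maintained read model, mirroring the primary/replica split described above, while the query side only ever reads.

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Query side: reads from its own model, never mutates domain state.
class CandidateQueries {
    private final Map<String, String> readModel = new ConcurrentHashMap<>();

    // Called by the command side to keep the read model current
    // (the in-memory equivalent of replication to a replica).
    void project(String id, String name) { readModel.put(id, name); }

    Optional<String> findCandidate(String id) {
        return Optional.ofNullable(readModel.get(id));
    }
}

// Command side: mutates the write store, returns nothing to query against.
class CandidateCommands {
    private final Map<String, String> writeStore = new ConcurrentHashMap<>();
    private final CandidateQueries readSide;

    CandidateCommands(CandidateQueries readSide) { this.readSide = readSide; }

    void addCandidate(String id, String name) {
        writeStore.put(id, name);
        readSide.project(id, name);
    }
}

public class CqrsSketch {
    public static void main(String[] args) {
        CandidateQueries queries = new CandidateQueries();
        CandidateCommands commands = new CandidateCommands(queries);
        commands.addCandidate("c-1", "Ada");
        System.out.println(queries.findCandidate("c-1").orElse("missing"));
    }
}
```

In a production system the projection step would happen asynchronously over the message broker rather than by direct method call, which is where the throughput advantage comes from: reads scale independently of writes.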
Event-driven applications are stateful. Each one records everything it does in an event log. Because everything is logged, if the application crashes, it can replay the event log to rebuild its state, catch up to where it was, and resume from there.
Microservices may need to share information among themselves. In the case of Insight Global, when a candidate applies through our Candidate Portal, that microservice sends out several messages over Kafka: one to our staffing module, letting the application know that a new potential candidate is available to be vetted, and others to our resume-parsing application, letting it know that there is a document available to be parsed and indexed. The recruiter picks up the candidate information and the phone, conducts an initial interview (in the literal sense), and records the results; the appropriate microservice then sends messages reporting those results to the staffing application. Other recruiters use a different microservice to search for viable candidates, and may come across the recently added candidate and the interview results.
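The fan-out just described can be modeled with a tiny in-memory stand-in for a topic-based broker like Kafka (this is an illustration of the publish/subscribe shape, not the real Kafka client API, and the topic name is made up): two independent services consume the same event without the publishing portal knowing anything about them.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Minimal topic-based publish/subscribe, standing in for a broker
// such as Kafka for illustration purposes only.
class MiniBroker {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    void publish(String topic, String message) {
        // Every subscriber on the topic receives its own copy of the message.
        for (Consumer<String> h : subscribers.getOrDefault(topic, Collections.emptyList())) {
            h.accept(message);
        }
    }
}

public class FanOutSketch {
    public static void main(String[] args) {
        MiniBroker broker = new MiniBroker();
        List<String> received = new ArrayList<>();
        // Staffing and resume-parsing services subscribe independently;
        // the portal only publishes, with no service-specific interfacing.
        broker.subscribe("candidate.applied", m -> received.add("staffing:" + m));
        broker.subscribe("candidate.applied", m -> received.add("parser:" + m));
        broker.publish("candidate.applied", "c-1");
        System.out.println(received);
    }
}
```

Adding a third consumer later requires no change to the publisher at all, which is the decoupling property the next paragraph relies on.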
If this were a single, monolithic application, maintaining it would require that a significant amount of time be spent on impact analysis for every change. Segregating the functionality allows for smaller changes with fewer unanticipated effects, and lets us scale up only the functions that need it. Using a message broker allows any application that needs a particular message to grab it for its own nefarious purposes, without having to be concerned about application- or data-engine-specific interfacing.
If you’re interested in learning more about this development approach, Vaughn Vernon’s “Implementing Domain-Driven Design” is an excellent place to start.
-Jared, Infrastructure Manager