Tech Reflections
When the congratulatory messages on my work anniversary started rolling in, it struck me just how fast the last 525,600 minutes have passed. So much has changed, and yet much remains the same.
What has changed? After nearly 13 years, I left Deloitte Consulting LLP to join the MITRE Corporation; I had honestly never thought that I would leave Deloitte. While I miss my friends and colleagues at Uncle D’s, I’ve gained new ones at MITRE. Steve Foote, Paul Vencill, Dr. Bob Cherinka, Rick Cagle, and Pat Benito are some of the smartest, most seasoned technologists I’ve ever known. Each has a true passion for solving problems for a safer world. My work and contributions at Deloitte directly shaped who I am and have given me unique insights. When blended with the imperatives of an FFRDC, I find myself leaning in from a software architecture, engineering, and DevSecOps perspective across defense and civilian federal and state governments. My vantage point may have changed, but the tech trends have continued to evolve.
There are myriad topics to talk about! For this musing, I’ll limit myself to a small sampling of what has changed in the last year: DataOps, Data Virtualization, Low Code/No Code, and the Software Bill of Materials (SBoM). I’ll also give my take on what remains the same: the need for solid software architecture and DevSecOps.
What has changed? DataOps, DataOps, DataOps! This is NOT DevOps for the database. As we shifted workloads to the cloud, we had originally anticipated cloud-native refactoring of apps as the next logical step. What we saw instead was a realization of the power the cloud brings to analytics, AI, and ML! This explosion in demand means we need new and different “SDLC” approaches. New environments and controls are needed to enable data scientists and data engineers to experiment and develop algorithms. New techniques for managing, testing, and operationalizing data are needed as well. Some of the new players in this space, like DataKitchen, are thinking through innovative approaches and what data pipelines should look like.
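To make that concrete, here is a minimal sketch in Python of the core DataOps idea that data gets tested the way code does: quality checks run as first-class steps inside the pipeline itself. The stage names and data are entirely hypothetical, not any particular vendor’s approach.

```python
# A toy DataOps-style pipeline: each stage is a plain function, and data
# quality checks run as pipeline steps, much as unit tests do in an app SDLC.
import pandas as pd

def extract() -> pd.DataFrame:
    # Stand-in for pulling from a source system or landing zone.
    return pd.DataFrame({"reading_c": [10.5, 11.2, None, 9.8]})

def validate(df: pd.DataFrame) -> pd.DataFrame:
    # "Test the data" the way we test code: fail fast on bad inputs.
    assert not df.empty, "no rows extracted"
    null_rate = df["reading_c"].isna().mean()
    assert null_rate < 0.5, f"too many nulls: {null_rate:.0%}"
    return df

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # A simple, repeatable transformation step (Celsius to Fahrenheit).
    return df.dropna().assign(reading_f=lambda d: d["reading_c"] * 9 / 5 + 32)

if __name__ == "__main__":
    print(transform(validate(extract())))
```

The point is not the tooling; it is that the pipeline, its tests, and its test data are all versioned, automated artifacts.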
Data Virtualization is another massively important trend that is here to stay. Moving all of your data to a single location can be a cybersecurity risk given the attack surface it creates. I often say that data is the greatest asset of the enterprise and you need to treat it like your finances! You don’t keep all your money in a coffee can in the backyard (at least I hope you don’t), and in the same way, not all of your data lives in the same place. You do, however, know exactly where every penny is located and how it is being capitalized and managed. Isn’t that what we need with our data? Data virtualization is also a massively helpful technique and tool for modernization, enabling you to surface and use data sitting in old stores in new ways while the nuanced details stay hidden away. When you are ready to deprecate or update the old store, the data consumers neither know nor care; the change can be essentially non-impactful.
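Here is a toy sketch of that idea, assuming nothing beyond the pattern itself (the store and dataset names below are invented): consumers query one facade, and only the facade knows which backing store owns which dataset.

```python
# Hypothetical illustration of the data virtualization pattern: a facade
# routes each dataset to whichever store currently owns it.
from typing import Protocol

class Store(Protocol):
    def query(self, dataset: str) -> list[dict]: ...

class LegacyMainframeStore:
    def query(self, dataset: str) -> list[dict]:
        return [{"source": "legacy", "dataset": dataset}]

class CloudWarehouseStore:
    def query(self, dataset: str) -> list[dict]:
        return [{"source": "cloud", "dataset": dataset}]

class VirtualDataLayer:
    """Consumers call query(); they never learn where the data lives."""
    def __init__(self) -> None:
        self._routes: dict[str, Store] = {}

    def register(self, dataset: str, store: Store) -> None:
        self._routes[dataset] = store

    def query(self, dataset: str) -> list[dict]:
        return self._routes[dataset].query(dataset)

layer = VirtualDataLayer()
layer.register("claims_history", LegacyMainframeStore())
layer.register("claims_current", CloudWarehouseStore())
print(layer.query("claims_history"))
# When claims_history later moves to the cloud, only the register() call
# changes; every consumer keeps calling layer.query("claims_history").
```

That last comment is the modernization payoff: deprecating the old store becomes a routing change, not a consumer migration.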
What else has changed? The adoption of Low Code and No Code (LCNC) platforms. We’ve been chasing this for nearly two decades and now technology has caught up with our ideals! I am admittedly a custom-dev junkie, and yet I realize our enterprise ecosystems need to use Low Code/No Code tools. There are some risks with the concept of “citizen developers” (functional folks making apps), but safe adoption of LCNC can accelerate delivery and offload repeatable workflows to the domain experts who can configure them to their needs. With a little partnering from IT to make sure the DevSecOps scaffolding (cyber, quality, release) is in place, LCNC platforms like Appian, Microsoft Power Apps, Mendix, and others are enabling the enterprise.
What about the ongoing explosion of software factories?! This is both positive and negative. If you can go faster and provide jumpstarts, how could this possibly be a bad thing? Software factories are in vogue, and they can absolutely help minimize unnecessary tool sprawl and give much-needed data on the value stream; there is also a risk, however, of black-boxing the inner workings away from the engineers and developers. Be judicious, be transparent, and most of all, make sure you have a cadre of black belts who know the inner workings of the software factory and the leading practices for using it. The need to understand the pedigree of every single component and line of code is another risk I’ve been studying, thanks to the Software Bill of Materials (SBoM) work by my MITRE colleague, Bob Martin. Containers are still the golden hammer, and having a container strategy is not just smart; it is an imperative.
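On the SBoM point, the core idea is machine-readable pedigree. A minimal sketch, assuming a simplified CycloneDX-style JSON document (the components listed are illustrative only), shows how a pipeline can enumerate everything a build contains before promoting it:

```python
# Walk a simplified CycloneDX-style SBoM and list each component's pedigree.
import json

sbom_json = """
{
  "bomFormat": "CycloneDX",
  "components": [
    {"name": "log4j-core", "version": "2.14.1",
     "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.14.1"},
    {"name": "requests", "version": "2.31.0",
     "purl": "pkg:pypi/requests@2.31.0"}
  ]
}
"""

sbom = json.loads(sbom_json)
for component in sbom.get("components", []):
    # In a real pipeline, each purl would be checked against a
    # vulnerability database as a gate before release.
    print(f'{component["name"]} {component["version"]} -> {component["purl"]}')
```

Whether the format is CycloneDX, SPDX, or something else, the gate is the same: no pedigree, no promotion.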
What remains the same? The need for solid software architecture! I have witnessed firsthand projects that use the term agile and banter about emergent architecture, only to be caught flat-footed on security, on scalability, and on the ability to achieve mission value. The trendy phrase “emergent architecture” was coined for small-scale applications that leverage industry frameworks and can adopt published reference architectures. In those instances, there is an inherited or prescribed architecture that “emerges” during development and delivery. It was never intended to apply to complex system-of-systems efforts, where the cost and practicality of technical debt and risk need to be calculated and addressed.
Like the term “requirements,” architecture must not be a forbidden word. Experienced architects and engineers have a specific responsibility to mentor development teams and provide guidance and direction. With that scaffolding and guidance established, delivery teams can thrive, providing new business features and supporting technical capabilities. My colleagues Scott Buchholz and Ken Corless at Deloitte recently published a relevant thought leadership piece called “Architecture awakens” that I highly recommend.
What else remains the same? The need to apply DevSecOps principles using a risk-based approach. Don’t “do DevOps” as if it were a single thing, because it isn’t! DevOps/DevSecOps is a set of practices that enable quality, stability, appropriate speed, and security. What you do and where you focus need to be based on your biggest risk areas. This is especially true when applying these principles to existing efforts rather than greenfield ones. Have a quality problem? Double down on automated testing and test data management (TDM). Have a stability issue? Work through infrastructure as code (IaC) for inherent auditability. Be intentional! Be pragmatic!
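To ground the “quality problem” case, here is a minimal, hypothetical sketch of what doubling down can look like in practice: automated pytest checks paired with TDM-friendly synthetic data, so tests never depend on copies of production records. The function and its behavior are invented for illustration.

```python
# Hypothetical example: automated tests over managed, synthetic test data
# (no production records required), runnable with `pytest`.
def mask_ssn(ssn: str) -> str:
    """Mask all but the last four digits of a social security number."""
    return "***-**-" + ssn[-4:]

def test_mask_keeps_last_four():
    assert mask_ssn("123-45-6789") == "***-**-6789"

def test_mask_never_leaks_leading_digits():
    assert "123-45" not in mask_ssn("123-45-6789")
```

The stability case has the same flavor: express the environment as code (Terraform, CloudFormation, Ansible, take your pick) so every change is versioned, reviewable, and auditable.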
A week or so ago, I listened to a leading manufacturer admit to banishing the term agile from their lexicon. Why? It is an overloaded term carrying far too much baggage, too many opinions, and too many power struggles. Probably the most inspirational and pragmatic statement made was that they are organized to move fast, but not at the expense of quality and getting it right the first time. Constant refactoring may work for an isolated mobile application and as a code quality technique, but it fails when applied at scale. The role of the architect is to guide these types of decisions.
So much to talk about! After a year of change, after 525,600 minutes of personal and professional change, I’m emerging to share my observations, my experience, and my thoughts. What topics interest you? Let’s connect! #Cloud #CloudNative #DevSecOps #DataOps #SoftwareArchitecture #SoftwareEngineering #SBOM #LowCodeNoCode #LCNC #Rockitect #MITRE
NOTE: My work with MITRE means that I must remain objective and provide technical recommendations that best meet the sponsor/client challenges. Any software, technology, or vendor mentioned is for illustrative purposes only and is not to be construed as an endorsement or partnership. At MITRE, we discover, we create, and we lead!