Posts

Better Architecture Everyday - A 1h interview with Gunnar Menzel

Paul Preiss, CEO & Founder of IASA Global, interviews Gunnar, Chief Technology and Innovation Officer for Capgemini Europe. Paul will conduct two short interviews with Gunnar on:

- The evolving role of the Solution Architect in the post-Covid era
- Agile Enterprise Architecture: myth or reality?

Paul and Gunnar will discuss the paradigm shift in business expectations and models caused by Covid-19, and how the role of the Solution Architect needs to change in the post-Covid-19 era. Access the interview here. They will also discuss how everything is accelerating. Gone are the days when we spent weeks developing EA artifacts. Today, architecture has to contribute to the solution lifecycle in an agile and speedy way, creating and delivering "just enough architecture". But what do we mean by "just enough"? Thanks for listening.

Agile Architecture

Agility is a central requirement for many organizations, and it is also changing the way we, as (IT) Architects, work. We work in an agile way to drive change that creates business opportunity through technology innovation. We shape and translate business and IT strategy needs into realizable, sustainable technology solutions, whilst taking end-to-end solution delivery ownership from idea to benefits delivery. Being able to respond to change means that we no longer create a Big Design Up Front (BDUF), where we define an architecture to be realized over a long period. Instead, the architecture design in an agile context:

- provides the vision (intentional architecture) into which the teams fit with their (development) work,
- gives the guard rails between which the agile teams make their own design decisions (emergent architecture),
- shows where teams need to connect to ensure interoperability between systems/services (solution architecture),
- provides guidance on generic

Goodbye VPN and Welcome SDP

We have all become used to VPNs: an application installed on your laptop or mobile device that allows you to access internal applications whilst traversing an insecure public network. VPN has its advantages; however, the list of disadvantages is long, and so in 2007 a new solution was created – the software-defined perimeter (SDP). In this short blog I will provide an overview of VPN and SDP.

VPN

VPN (virtual private network) started with an approach called the Point-to-Point Tunneling Protocol (PPTP). Its Request for Comments (RFC) 2637 was published by a consortium led by Microsoft in July 1999, and the main idea was to create a secure connection between two endpoints, allowing for secure communication between them (typically an end-user device and a secure endpoint like a firewall). The connection is also referred to as a tunnel. PPTP is no longer in use, as the NSA reportedly managed to break PPTP, rendering it insecure. As noted, VPN basically creates a
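The core contrast between the two models can be sketched in a few lines: a VPN grants broad network access after one login, while an SDP authorizes each user-to-application connection and keeps everything else dark. This is a minimal sketch only – the user and application names are invented, and a real SDP controller would use mTLS and single-packet authorization rather than an in-memory set:

```python
# Hypothetical allow-list an SDP controller might hold: (user, application)
# pairs that have been explicitly authorized. Everything else is invisible.
AUTHORIZED = {("alice", "payroll-app"), ("bob", "wiki")}

def sdp_gateway(user: str, app: str) -> str:
    # Default deny: unlike a VPN, access is granted per application,
    # and unauthorized requests get no response at all.
    if (user, app) in AUTHORIZED:
        return f"tunnel established: {user} -> {app}"
    return "dropped (no response; service stays dark)"

print(sdp_gateway("alice", "payroll-app"))  # tunnel established: alice -> payroll-app
print(sdp_gateway("alice", "wiki"))         # dropped (no response; service stays dark)
```

The design point is the default-deny posture: with a VPN, once you are "inside" the tunnel you can typically reach the whole network segment; with an SDP, every connection is individually verified first.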

What is Microsegmentation?

Sometimes dismissed as a "security term" and therefore ignored by many, microsegmentation is a new blueprint that we all (as IT Architects) should understand. Microsegmentation refers to the ability to segment compute, storage, and network into one virtual zone in order to control inbound and outbound traffic in both north-south and east-west directions. The main aim of microsegmentation is to significantly increase security by containing threats within a small(er) area – a Zero Trust approach. Breaches in security are well documented in the press nowadays, and with the increase of digital (in particular automation and full connectivity) it seems that attacks exploiting unknown vulnerabilities are one of the key threats organisations have to protect themselves against. In a 2015 Forrester study (see here), software exploits, at ~37% of all attacks, top the list of the most used attack mechanisms. With the rise of so-called Exploit Kits, many environments are increasingly at risk
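The Zero Trust idea behind microsegmentation – deny all flows between zones unless a rule explicitly allows them, in the east-west direction as much as north-south – can be sketched as follows. The zone names and rules are invented for illustration; real microsegmentation platforms enforce this in the hypervisor or network fabric:

```python
# Hypothetical segmentation policy: explicit allow rules between zones.
ALLOW_RULES = {
    ("web-tier", "app-tier"),   # north-south: web may call the app tier
    ("app-tier", "db-tier"),    # east-west: app may query the database
}

def is_allowed(src_zone: str, dst_zone: str) -> bool:
    # Zero Trust default: deny unless a rule explicitly allows this flow.
    return (src_zone, dst_zone) in ALLOW_RULES

print(is_allowed("web-tier", "app-tier"))  # True
print(is_allowed("web-tier", "db-tier"))   # False: web cannot bypass the app tier
```

Note how a compromised web server cannot reach the database directly: the threat stays contained within its small segment, which is the whole point of the approach.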

Fog and Edge Computing

Edge and fog computing are closely related – both refer to the ability to process data closer to the requestor/consumer to reduce latency and cost and to improve user experience. Both are able to filter data before it "hits" a big data lake for further consumption, reducing the amount of data that needs to be processed. The basic idea of edge computing is to move data logic (mainly around data validation / data grammar checks) to an outer ring of capabilities. This is in direct response to the sheer increase in data bandwidth required by end devices, and has been fuelled by the explosion of IoT (Internet of Things), which in turn has increased the need to process the generated data much closer to the source, in real time. In other words, edge and fog computing push the cloud (read: data centre) closer to the requestor to minimise latency, minimise cost and increase quality. Let's use an example: a Boeing 787 generates 40 terabytes (TB) per hour of flight, half a TB of which is ultimately t
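The filtering idea above – validate data at the edge and forward only the interesting fraction to the central data lake – can be sketched in a few lines. The sensor readings and thresholds are invented for illustration, not taken from any real aircraft telemetry:

```python
# Hypothetical stream of sensor readings arriving at an edge node.
READINGS = [21.5, 21.6, 98.4, 21.4, 21.7, 102.1, 21.5]

def edge_filter(readings, low=0.0, high=90.0):
    # Validate locally and forward only out-of-range (anomalous) values;
    # everything in the normal range is handled or discarded at the edge.
    return [r for r in readings if not (low <= r <= high)]

forwarded = edge_filter(READINGS)
print(forwarded)  # [98.4, 102.1]
reduction = 1 - len(forwarded) / len(READINGS)
print(f"{reduction:.0%} less data sent upstream")  # 71% less data sent upstream
```

Even this toy filter cuts the upstream volume by more than two thirds, which is exactly the effect that makes edge processing attractive at IoT scale.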

Our Latest POV Paper: The State of the Art in Agile Software Development

This week we finally managed to release our external POV: The State of the Art in Agile Software Development. What is the latest in the agile software development life cycle (SDLC)? This is a question I often get asked, so we decided to create a paper to answer it. Over the past couple of months, together with experts across the entire Capgemini Group, we developed this 22-page agile software development paper focused on a number of key questions:

- What are the key elements of a sustainable and scaled agile SDLC?
- What are the main characteristics, and the main people, process, and technology related aspects?
- What about agile IT architecture across the SDLC?
- How do you empower people along the SDLC? How do you change the culture and the skills?
- How do you deploy a top-quality, secure and compliant SDLC?
- How do you make best use of innovative technologies, and what does a typical journey look like?

To make it as relevant as possible we decided to focus on fin

Event based versus Data based Programming

Over the years our landscapes have significantly changed, shifting from monolithic to client-server to, now, cloud-based microservices architectures. Along with the shift to cloud-based microservices, programming styles have also changed, and in this short blog I will focus on programming paradigms by discussing event-based versus data-based programming. When I started programming in the mid-1980s, only a small number of so-called "programming paradigms" were available – most notably imperative languages [1], or better, procedural/functional languages like C++ (developed around 1980), as well as Modula and Ada. For me, the work that Bjarne Stroustrup did related to programming was hugely influential – his book "The C++ Programming Language", published in 1985, was one of my first IT books and the start of my IT career. Today, more than 30 years later, there are between 25 and 30 different programming paradigms – everything from "action" and "agent-oriented" to "symbolic" and "value-level".
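The event-based style mentioned in the title can be illustrated with a minimal publish/subscribe sketch: handlers register interest in named events and react when one fires, rather than being called procedurally. The event names and handlers here are invented for illustration:

```python
# Minimal event-based programming sketch: a tiny in-process event bus.
from collections import defaultdict

handlers = defaultdict(list)

def subscribe(event: str, fn) -> None:
    # Register a handler for a named event.
    handlers[event].append(fn)

def publish(event: str, payload) -> None:
    # Fire the event: every subscribed handler reacts independently.
    for fn in handlers[event]:
        fn(payload)

subscribe("order.created", lambda o: print(f"email sent for order {o}"))
subscribe("order.created", lambda o: print(f"stock reserved for order {o}"))

publish("order.created", 42)
# -> email sent for order 42
# -> stock reserved for order 42
```

The publisher never knows who is listening – that loose coupling is what makes the style a natural fit for the microservices architectures described above, where services react to events rather than call each other directly.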