Prometheus Monitoring High-Level Architecture
A lot of customers ask:
Can a CodePipeline GitHub source provide more than just one branch?
CodePipeline can currently only run on a single branch per source action, which is defined in the action configuration. You need to specify a Git repository and branch name when you create a pipeline, so if the branch name is not known in advance — as with the GitFlow branching model — you cannot create the pipeline ahead of time.
Being tied to a single Git branch is more a feature of CodePipeline than a limitation, as the design leans towards trunk-based development. According to the designers of the service, CodePipeline is intended for post-merge/release validation: once your change is ready to be released to production and is merged into your master/main branch, CodePipeline takes over and automatically tests and releases the final merged set of changes. CodePipeline has features such as stage locking and superseding versions that don't work well when you want to test a change in isolation before it's merged (e.g. feature branch testing or pull request testing). Therefore there is currently no recommended way to do this in CodePipeline.
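The single-branch constraint shows up directly in the source action's configuration. A minimal CloudFormation sketch — the connection ARN, account ID, repository and all names below are placeholders, not values from this post:

```yaml
# Illustrative fragment of an AWS::CodePipeline::Pipeline definition.
# Every value here is a placeholder for illustration.
Stages:
  - Name: Source
    Actions:
      - Name: GitHubSource
        ActionTypeId:
          Category: Source
          Owner: AWS
          Provider: CodeStarSourceConnection
          Version: "1"
        Configuration:
          ConnectionArn: arn:aws:codestar-connections:us-east-1:111111111111:connection/example
          FullRepositoryId: example-org/example-repo
          BranchName: main   # exactly one branch per source action
        OutputArtifacts:
          - Name: SourceOutput
```

Because `BranchName` is a fixed property of the action, tracking a second branch means creating another source action (or another pipeline) rather than configuring a pattern.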
Gave a training session to fellow Amazonians on Jenkins.
I set up a GitHub repo with the necessary infrastructure: https://github.com/shariqmus/intro_to_jenkins
Multiple Pipelines
What we often see is customers setting up one pipeline per environment. This sounds sensible, as it respects the boundaries that accounts are designed to provide! Who has a design like this?
The issue is that you also get drift in the definitions of these pipelines. Defining your pipeline in code with CloudFormation is a good practice, but we still see customers creating different templates for different environments. The problem is that when the change process is defined differently for each environment, you're going to get different results. As you promote code changes through the environments, you're not building confidence: you're using different build artifacts for different environments, and you're slowed down by having to process every change once per environment.
We’re starting to consider this a bit of an anti-pattern as we see customers struggling with this.
Single Pipeline
A better way to set things up is to have a pipeline that resides in your Dev or Prod environment, with different stages for different environments. Who has this sort of design?
This means that any change made to your app/service/microservice is visible in one place — you can visualize the entire SDLC. Of course you want to store the pipeline in code, and you want to ensure that the right people have the right permissions to change it.
Setting things up this way will increase your release velocity, as everyone can easily see what’s happening to code changes as they pass through the environments. This means that you don’t need to coordinate across team members and don’t need to manage multiple pipelines.
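As a sketch, the stage layout of such a single pipeline might look like the fragment below. Stage names are hypothetical, and the actions inside each stage are elided; the point is one source, one build artifact, then deploy stages per environment:

```yaml
# Illustrative fragment of an AWS::CodePipeline::Pipeline definition.
# Stage names are hypothetical; actions within each stage are omitted.
Stages:
  - Name: Source        # pull the change once
  - Name: Build         # build a single artifact reused by every environment
  - Name: Deploy-Dev
  - Name: Deploy-Test
  - Name: Approve-Prod  # e.g. a manual approval action as a release gate
  - Name: Deploy-Prod
```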
The downside of this approach is the lack of environmental isolation. If your pipeline is defined in a Dev account, those accounts tend to be a little lighter on security, so you run the risk of a bad actor introducing malicious code into the pipeline that gets pushed into Production. And if the pipeline is defined in Prod, it becomes hard to access when you need to change the earlier stages of the pipeline! So although this pattern will increase your release velocity, it isn't ideal from a security perspective.
So what is the best practice?
Pipeline in a "Tools" or "Shared" Account
Well, we’re seeing customers have a lot of success with putting DevOps tools like CD pipelines in their own account. The pipelines in this account assume cross-account IAM roles to access resources in the Dev, Test and Production accounts. This enforces good security and operational practices in how you define your pipelines in code and how you manage the end-to-end flow of changes to your application.
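The cross-account access works through role assumption. A minimal sketch of the trust policy on a deployment role in, say, the Production account — account IDs and role names are placeholders — allowing the pipeline's role in the tools account to assume it:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111111:role/ToolsAccountPipelineRole"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

The deployment role's permission policy then grants only what that environment's deploy stage needs, so the tools account never holds standing credentials for Production.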
I have 4 out of 5 AWS Certifications today.
| AWS Certified Solutions Architect - Professional (English) Completed |
| Hello Shariq Mustaquim, |
| Congratulations! You have successfully completed the AWS Certified Solutions Architect - Professional exam and you are now AWS Certified. You can now use the AWS Certified Solutions Architect - Professional credential to gain recognition and visibility for your proven experience with AWS services. |
...
| Overall Score: 90% |
| Topic Level Scoring: |
| 1.0 High Availability and Business Continuity: 90% |
| 2.0 Costing: 100% |
| 3.0 Deployment Management: 100% |
| 4.0 Network Design: 85% |
| 5.0 Data Storage: 90% |
| 6.0 Security: 92% |
| 7.0 Scalability & Elasticity: 81% |
| 8.0 Cloud Migration & Hybrid Architecture: 85% |

Set up MFA for Active Directory with RADIUS (Duo server)