How does Airflow schedule Daylight Saving Time?

One morning you find out your favorite Airflow DAG did not run that night. Sad… Six months later the task ran twice, and now you understand: you scheduled your DAG timezone-aware, and the clock goes back and forth because of Daylight Saving Time. For example, in Central European Time (CET) on Sunday 29 March 2020, at 02:00 the clocks were turned forward one hour, from “local standard time” to 03:00 “local daylight time”....

<span title='2020-10-29 01:26:43 +0000 UTC'>October 29, 2020</span>&nbsp;·&nbsp;2 min&nbsp;·&nbsp;326 words&nbsp;·&nbsp;Joost

Control-flow structure for database connections

With Python, creating a database connection is straightforward. Yet I often see the following case go wrong, while a simple solution is easily at hand using the context manager pattern. For database connections, you’ll need at least one secret. Let’s say you get this secret from a secret manager by calling the get_secret() method. You also use a utility like JayDeBeApi to set up the connection, and you are smart enough to close the connection after querying and to delete the password:...
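The context manager pattern can be sketched with the stdlib's sqlite3 standing in for JayDeBeApi, and a hypothetical get_secret() stub standing in for the secret manager; the point is that the connection is closed and the password dropped even when the query raises.

```python
import sqlite3
from contextlib import contextmanager


def get_secret():
    # Hypothetical stand-in for a secret-manager call.
    return "s3cr3t"


@contextmanager
def db_connection(path=":memory:"):
    password = get_secret()        # fetched just-in-time, kept as briefly as possible
    conn = sqlite3.connect(path)   # real setups would pass the password to the driver
    try:
        del password               # drop the secret once the connection exists
        yield conn
    finally:
        conn.close()               # always runs, even if the query raised


with db_connection() as conn:
    rows = conn.execute("SELECT 1").fetchall()
print(rows)
```

With a driver that already supports the protocol, `contextlib.closing(...)` achieves the same guarantee without a custom wrapper.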

<span title='2020-10-05 01:26:43 +0000 UTC'>October 5, 2020</span>&nbsp;·&nbsp;2 min&nbsp;·&nbsp;386 words&nbsp;·&nbsp;Joost

Provide Spark with cross-account access

If you need to provide Spark with resources from a different AWS account, that can be quite tricky to figure out. Let’s assume you have two AWS accounts: the alpha account, where you run Python with IAM role alpha-role and have access to the Spark cluster; and the beta account, where you have the S3 bucket you want to access. You could give S3 read access to the alpha-role, but it is more durable and easier to manage to create an access-role in the beta account that can be assumed by the alpha-role....
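The trust policy on that access-role in the beta account might look like the following sketch; the account ID is a placeholder, and alpha-role is the role named above.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:role/alpha-role" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

The access-role then carries an ordinary S3 read policy on the beta bucket, and the alpha side only needs permission to call sts:AssumeRole on it.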

<span title='2020-08-21 01:26:43 +0000 UTC'>August 21, 2020</span>&nbsp;·&nbsp;2 min&nbsp;·&nbsp;413 words&nbsp;·&nbsp;Joost

Upload GitLab CI artifacts to S3

With GitLab CI it is incredibly easy to build a Hugo website (like mine); you can even host it there. But in my case I use AWS S3 and CloudFront because they are cheap and easy to set up. The CI pipeline to build and upload the static website is also straightforward with the following .gitlab-ci.yml: variables: GIT_SUBMODULE_STRATEGY: recursive stages: - build - upload build: stage: build image: monachus/hugo script: - hugo version - hugo only: - master artifacts: paths: - ....
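The excerpt cuts off before the upload stage; a hypothetical job of that shape could look like this sketch, assuming `BUCKET` and the AWS credentials are set as CI/CD variables (this is not the author's exact file).

```yaml
# Hypothetical upload stage: sync the Hugo output in public/ to an S3 bucket.
upload:
  stage: upload
  image:
    name: amazon/aws-cli
    entrypoint: [""]
  script:
    - aws s3 sync public/ "s3://$BUCKET" --delete
  only:
    - master
```

The `--delete` flag keeps the bucket in step with the build by removing files that no longer exist in the artifact.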

<span title='2020-07-05 01:26:43 +0000 UTC'>July 5, 2020</span>&nbsp;·&nbsp;1 min&nbsp;·&nbsp;206 words&nbsp;·&nbsp;Joost

Secure deployment to Kubernetes with a service account

Now that I have a number of pipelines running, I would like to deploy them to Kubernetes through a service account. That is quite simple. As an admin user, provide resources such as: the namespaces, optionally with limited resources; an isolated service account with restricted access to one namespace; an encoded config file to be used by the GitLab pipeline. Service Account with permissions The following file serviceaccount.yaml creates the service account and a role, and attaches that role to that account:...
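A serviceaccount.yaml of that shape could look like the following sketch; the names (`deployer`, `pipelines`) and the rule list are illustrative placeholders, not the author's actual file.

```yaml
# Hypothetical serviceaccount.yaml: a service account restricted to one namespace.
apiVersion: v1
kind: ServiceAccount
metadata:
  name: deployer
  namespace: pipelines
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: deployer-role
  namespace: pipelines
rules:
  - apiGroups: ["", "apps"]
    resources: ["pods", "services", "deployments"]
    verbs: ["get", "list", "create", "update", "delete"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: deployer-binding
  namespace: pipelines
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: deployer-role
subjects:
  - kind: ServiceAccount
    name: deployer
    namespace: pipelines
```

Because the Role (rather than a ClusterRole) is namespaced, the account cannot touch anything outside `pipelines`, which is the isolation the post is after.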

<span title='2020-04-28 01:26:43 +0000 UTC'>April 28, 2020</span>&nbsp;·&nbsp;2 min&nbsp;·&nbsp;373 words&nbsp;·&nbsp;Joost