Everyone can contribute! Let's learn together in a weekly cafe ☕
We love to break things, make mistakes, debug, analyse and fix problems together. Live and unfiltered on YouTube.
Community members and thought leaders regularly join and share their projects and ideas.
"Everyone Can Contribute" is inspired by GitLab's mission.
7. Cafe: Docker Hub Rate Limit: Mitigation, Caching and Monitoring
We went from a quick introduction to the Docker Hub rate limits into a thoughtful discussion of who is affected and how to mitigate the limits. Every CI/CD pipeline and job run in a container is affected, as are deployments in containerized environments such as Kubernetes clusters.
Next to caching proxies, we will see a need for building and maintaining your own Docker images. Following the OCI specification, this needs a central repository hosting the Dockerfile definitions, and CI jobs for building, tagging and pushing to a local container registry, e.g. in GitLab.
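As a sketch, such a build-and-push job in `.gitlab-ci.yml` could use GitLab's predefined registry variables; the Docker versions and tag scheme are placeholders, not a recommendation:

```yaml
build-image:
  image: docker:19.03
  services:
    - docker:19.03-dind
  script:
    - docker login -u "$CI_REGISTRY_USER" -p "$CI_REGISTRY_PASSWORD" "$CI_REGISTRY"
    - docker build -t "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA" .
    - docker push "$CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA"
```

`CI_REGISTRY*` and `CI_COMMIT_SHORT_SHA` are predefined GitLab CI/CD variables, so the job works without extra configuration once the project registry is enabled.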
There is no clear prediction on whether the Docker CLI, podman, or $tool will win. As developers, we probably do not care, as they all implement an API specification and "something" does "something" to format, build, test, and deploy our code.
We also discussed the changed image names (short image names vs. absolute registry URLs) and the changes required in every CI/CD configuration file in every project. Niclas shared their best practice of updating CI configuration files with renovate, similar to a bot updating software dependencies and submitting merge requests.
Building and maintaining container images moves more responsibility to ops, with security deeply tied into the process. Container scanning will become the standard process: a newly tagged image renders older versions potentially out-of-date, and images need automated scanning for vulnerabilities and required updates.
The conclusion was to pick the solution that fits best into your current environment. Invest in a local container registry and your own images, with integrated automation and security. If you cannot afford the resources now, invest in the Docker Hub Pro tier and buy some time to schedule an infrastructure shift next year.
Enjoy the session! 🦊
- Docker - check your current pull rate limits with curl
- Mitigate Docker Hub Limits with image caching
- GitLab Dependency Proxy moving to Core
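The rate-limit check mentioned above boils down to two requests: fetch an anonymous token, then send a HEAD request to a manifest endpoint and read the `ratelimit-*` response headers. A minimal sketch:

```shell
#!/bin/sh
# Check your current Docker Hub pull rate limits (anonymous check).

# Extract the bearer token from the auth service's JSON response.
parse_token() {
  grep -o '"token":"[^"]*' | cut -d'"' -f4
}

TOKEN=$(curl -s "https://auth.docker.io/token?service=registry.docker.io&scope=repository:ratelimitpreview/test:pull" | parse_token)

# HEAD request against the manifest endpoint; the ratelimit-limit and
# ratelimit-remaining headers show your current pull window.
curl -s --head -H "Authorization: Bearer $TOKEN" \
  "https://registry-1.docker.io/v2/ratelimitpreview/test/manifests/latest" \
  | grep -i '^ratelimit'
```

Authenticated users can add `-u user:password` to the token request to see their account's limits instead of the per-IP anonymous ones.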
Docker Image Creation
- DinD: https://github.com/docker-library/docker/issues/38
- K8s executor in GitLab Runner: https://gitlab.com/gitlab-org/gitlab/-/merge_requests/46612/diffs
Ideas and discussions
GitLab Infra Insights:
Build own Images
As a Developer, I do not care about Docker, Podman, or others. It is just an API. Similar to CI/CD executors.
How to build container images - is it easy/hard?
- OCI Specification
Rebuilding images: it can be painful to migrate from the short Docker Hub URLs to the longer registry URLs. Create a CI job, build your image, and push it to the registries. Find a good way to update the existing .gitlab-ci.yml configuration file with the new image references.
Pull images, tag them, push them to the GitLab registry. This is a PoC: https://gitlab.com/greg/docker
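The pull/tag/push cycle can be sketched like this; the group/project path is a placeholder, not the PoC's actual path:

```shell
#!/bin/sh
# Sketch: mirror an upstream Docker Hub image into a GitLab project registry.
# Replace REGISTRY with your own group/project path.
REGISTRY="registry.gitlab.com/mygroup/myproject"

# Build the target name inside the project registry.
mirror_name() {
  printf '%s/%s\n' "$REGISTRY" "$1"
}

SRC="alpine:3.12"
DST="$(mirror_name "$SRC")"

# Requires a prior `docker login registry.gitlab.com`.
if command -v docker >/dev/null 2>&1; then
  docker pull "$SRC"
  docker tag "$SRC" "$DST"
  docker push "$DST"
fi
```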
Use renovate to update the image in the CI config
Automate this in a project :)
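A minimal `renovate.json` sketch for this, assuming Renovate's `gitlabci` manager covers the project (preset and manager names as documented by Renovate at the time):

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": ["config:base"],
  "enabledManagers": ["gitlabci"]
}
```

With this in place, the bot opens merge requests whenever an `image:` reference in the CI configuration has a newer tag available.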
We also need to take care of security for containers, e.g. Container Scanning in GitLab
You can edit `/etc/docker/daemon.json` and add the `registry-mirrors` key shown in the blog post.
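For example, pointing the daemon at a pull-through cache (the mirror URL is a placeholder for your own cache):

```json
{
  "registry-mirrors": ["https://registry-mirror.example.com"]
}
```

Restart the Docker daemon after editing the file so the mirror takes effect.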
- Monitoring Plugin written in Python 3 by GitLab Developer Evangelists - [draft blog post MR](https://gitlab.com/gitlab-com/www-gitlab-com/-/merge_requests/66940/diffs)
- Prometheus Node Exporter? Keep the scrape interval high, every pull counts.
- Are there better APIs available or do we just migrate away from Docker Hub?
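If you go the Prometheus route, the "keep the scrape interval high" advice translates into a scrape job like this sketch; the job name, interval, and exporter target are placeholders:

```yaml
scrape_configs:
  - job_name: "dockerhub-ratelimit"
    scrape_interval: 30m   # long interval on purpose: each check costs a request
    static_configs:
      - targets: ["ratelimit-exporter.example.com:9100"]
```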