Found 158 repositories (showing 30)
buildkite
An example pipeline that runs and tests a Go project inside Docker using Buildkite.
voidwatch
SMTPHook is a lightweight, container-friendly email ingestion pipeline built in Go. It captures SMTP emails, parses them into structured JSON, and routes them to your webhook or API — perfect for alerting systems, pagers, bots, and custom workflows. Fast setup • Self-hosted • Easily extendable • Works great with Podman or Docker
buildkite
An example pipeline that runs and tests a Go project using Buildkite, without Docker.
hoangsonww
⚙️ A lightweight C-based task VM with content-addressed storage, automatic incremental caching, and terminal-visible dependency graphs; parallel-capable, polyglot (Go, Rust, Python, Node.js, Ruby, assembly) workflow engine with Docker support for reproducible pipelines.
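Engines like this one hinge on content-addressed storage: artifacts are keyed by a hash of their content, so identical inputs always resolve to the same cache entry, which is what makes incremental caching safe. A toy Go sketch of the idea (illustrative only; the repo above implements this in C):

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// Store is a toy content-addressed store: blobs are keyed by the hex
// SHA-256 digest of their content, so identical inputs always map to
// the same key (the basis of incremental caching).
type Store struct {
	blobs map[string][]byte
}

func NewStore() *Store { return &Store{blobs: make(map[string][]byte)} }

// Put stores content and returns its digest key.
func (s *Store) Put(content []byte) string {
	sum := sha256.Sum256(content)
	key := hex.EncodeToString(sum[:])
	s.blobs[key] = content
	return key
}

// Get retrieves content by digest.
func (s *Store) Get(key string) ([]byte, bool) {
	b, ok := s.blobs[key]
	return b, ok
}

func main() {
	s := NewStore()
	k1 := s.Put([]byte("go build ./..."))
	k2 := s.Put([]byte("go build ./..."))
	fmt.Println(k1 == k2) // prints true: identical content, identical address
}
```

Because the key is derived from the content, a build step whose inputs hash to an existing key can reuse the cached output instead of re-running.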
ankursoni
The 'Roiergasias' Kubernetes operator addresses a fundamental requirement of any data science / machine learning project running its pipelines on Kubernetes: quickly provisioning a declarative data pipeline, on demand, using simple kubectl commands (essentially implementing the concept of NoOps). The underlying principle is to combine the best of Docker, Kubernetes, and programming-language features to run a workflow with minimal workflow-definition syntax. It is a Go-based workflow engine that runs on the command line or on Kubernetes via a custom operator, providing a quick, automated data pipeline for your machine learning projects (a flavor of MLOps).
NadeeshaMedagama
Go web service with automated Docker Kubernetes CI/CD pipeline
NadeeshaMedagama
Lightweight Go web service with automated Docker-based CI/CD pipeline
shazforiot
A complete tutorial on continuous integration and continuous deployment. We create a pipeline script that clones a Git project, compiles, tests, and packages it with Maven, and then deploys the WAR file to a web server running as a Docker container.
DamianFigiel
A lightweight, modular CI/CD pipeline built in Go, automating builds and deployments with GitHub, Docker, and Kubernetes. This tool fetches GitHub commits, builds Docker images from repository code, and deploys them to Kubernetes clusters with rolling updates. Perfect for DevOps enthusiasts looking to streamline deployment workflows.
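A pipeline like this is essentially a chain of stages, each consuming the previous stage's output (commit ref, then image tag, then rollout status). A hedged Go sketch of that chaining, with stubs standing in for the real GitHub, Docker, and Kubernetes calls (the Stage type and stage names are assumptions, not this repo's API):

```go
package main

import "fmt"

// Stage is one step of a hypothetical fetch -> build -> deploy pipeline.
type Stage struct {
	Name string
	Run  func(input string) (string, error)
}

// runPipeline threads each stage's output into the next, stopping on error.
func runPipeline(stages []Stage, input string) (string, error) {
	out := input
	for _, st := range stages {
		var err error
		out, err = st.Run(out)
		if err != nil {
			return "", fmt.Errorf("%s: %w", st.Name, err)
		}
	}
	return out, nil
}

func main() {
	stages := []Stage{
		// Stubs: a real pipeline would call the GitHub API, docker build/push,
		// and a Kubernetes rolling update here.
		{"fetch", func(ref string) (string, error) { return "commit-abc123", nil }},
		{"build", func(c string) (string, error) { return "registry/app:" + c, nil }},
		{"deploy", func(img string) (string, error) { return "rolled out " + img, nil }},
	}
	result, err := runPipeline(stages, "main")
	if err != nil {
		panic(err)
	}
	fmt.Println(result) // prints: rolled out registry/app:commit-abc123
}
```

Keeping each stage behind the same function signature is what makes such a tool "modular": stages can be swapped or reordered without touching the driver loop.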
7erry
Hazelcast Jet and IMDG with a MySQL source, with CDC updates via the Kafka Connect Debezium connector, for a scenario with batch and real-time updates of securities data in MySQL.

In this example we test both real-time and batch feeds into IMDG, the write-through and read-through capabilities of Hazelcast IMDG, and data pipelining via Hazelcast Jet working against an RDBMS system of record (SOR). The RDBMS can be updated by many apps, and we demonstrate CDC via Kafka using the Debezium MySQL connector to refresh the IMDG asynchronously, and also synchronously via a client app.

Setup is with docker-compose and 7 containers:
- 1 Lenses.io container with a single-node Kafka / Schema Registry / Kafka Connect / Kafka REST server plus a Kafka management UI
- 2 Hazelcast Jet/IMDG cluster node containers (hazelcast1, hazelcast2)
- 1 Hazelcast Jet/IMDG container used for submitting Jet jobs (hz_jet_submit)
- 1 Hazelcast IMDG Management Center container (mancenter)
- 1 Hazelcast Jet Management Center container (hz_jet_mancenter)
- 1 MySQL source DB container (mysql), which serves as our SOR for the demo

Use docker-compose to fire up all the above containers:

  docker-compose -f hazelcast-jet-ent-docker-compose.yaml up -d

We get real-time and historic stock market data from Alphavantage Inc. APIs: daily OHLCV (open/high/low/close/volume) for the past 20 years (1999-2019) for the 30 stocks in the Dow Jones Industrial Average, plus real-time data in 1-minute increments for the same stocks over a 7-day window. The MySQL container has a /scripts folder for data loading, which creates a securities_master database and populates tables with the stock data above as well as reference data on all S&P 500 stocks. To help with editing/debugging of DB scripts without having to respin the containers every time, and to persist data locally on the host, the MySQL scripts, conf, and data folders are mapped to a host volume via docker-compose.

To run the data loading scripts inside the container:

  hazelcast-kafka-cdc-test $ docker exec -it mysql bash
  root@ee8690b41e43:/# /scripts/init-create-load-databases-tables.sh

The Kafka container has a scripts folder with the configuration for the Kafka Connect Debezium MySQL connector that handles CDC updates from the MySQL source DB. To configure the Debezium connector, go to the Lenses UI on the Kafka container at http://localhost:3030 (username: admin, password: admin), navigate to Connectors -> Create New Connector -> choose "CDC for MySQL", and copy and paste the properties (the uncommented part of the file) from kafka-scripts/debezium-mysql-connector.properties into the UI. The connector should start up and create several topics: one for capturing database-wide DDL events, and one topic per table for that table's update events. To help with editing/debugging of scripts or config files without having to respin the containers every time, and to persist data locally on the host, the scripts and config files are mapped to a host volume via docker-compose.

Hazelcast Management Center is at http://localhost:8080/hazelcast-mancenter/login.html. Set up a login and verify that the cluster hz-jet-ent-cluster is operational with 3 Jet/IMDG nodes (running on ports 5701, 5702, 5703 of the host). Hazelcast Jet Management Center is at http://localhost:9090; use the default credentials (admin/admin) to log in and verify the same cluster and nodes.

There is also a test Hazelcast Jet job provided as a Maven project in the kafka2imap folder tree in the repo. It reads the topic "sp500_stocks" and writes to a Hazelcast IMap called "securities_master.sp500_stocks". To build the shaded (fat) jar for submitting this job to Jet, run mvn clean package on the provided pom.xml in the kafka2imap project folder. This creates a jar with all dependencies included. (Note: to use the Jet built-in wrapper script to submit jobs from the command line to the Jet/IMDG cluster, we need to package all dependencies and submit them together, or add the dependencies to the classpath as part of job submission, since there is no guarantee that dependencies will be available on all the nodes in the grid.) To help with editing/debugging without respinning the containers, and to persist data locally on the host, there is a job-jars folder for job artifacts and a resources folder with all the script and config files, both mapped to a host volume via docker-compose.

To submit jobs to Jet using the submit command-line utility, there is a convenience wrapper script that is run inside the hz_jet_submit container as follows:

  hazelcast-kafka-cdc-test $ docker exec -it hz_jet_submit bash
  bash-4.4# pwd
  /opt/hazelcast-jet-enterprise
  bash-4.4# cd job-jars/
  bash-4.4# ls
  kafka2imap-1.0-SNAPSHOT.jar  run-kafka2imap-job.sh  run-wordcount-job.sh  test.out
  bash-4.4# ./run-kafka2imap-job.sh
  Verbose mode is on, setting logging level to INFO
  Submitting JAR './job-jars/kafka2imap-1.0-SNAPSHOT.jar' with arguments []
MicrosoftDocs
Sample Go application referred to by Azure Pipelines documentation
olliepop
This is a template for a Go monolithic API development and deployment pipeline. AWS ECS, Docker, Bitbucket Pipelines.
saikrishna-koppula
In this repository we create a project that uses the open-source services Kafka and Spark with Docker and performs the activities of an end-to-end pipeline.
Ark667
Ready-to-go Docker image with an Azure DevOps agent and some tools to run pipelines.
gianebao
Bitbucket Pipelines Go MySQL Docker image
Deploying a Flask and Go App (3 Microservices) with Kubernetes and DockerHub
colstrom
Build pipeline for Go on Docker
gianebao
Bitbucket Pipelines Go(lang), (Amazon)AWS, MySQL Docker image
pgaijin66
Source code for CICD pipeline for Go application using Docker Scout
Custom Docker image build pipeline for OpenClaw, adding Go, uv, gh, Homebrew, Docker CLI and Chromium
isubho
Go application with a multistage Docker build, built from an Azure DevOps pipeline and deployed to AKS.
kensspace
Jenkins pipeline that goes through Sonar, unit tests, build, Docker, Kubernetes deployment, automation testing, and security testing.
AmirHossenAshraf
Go microservice template with gRPC & REST API, clean architecture, Docker support, and CI/CD pipeline.
MaksymLeus
HostInfo – A lightweight Go backend service for system information, with Docker support, health checks, and automated CI/CD pipelines. Includes semantic versioning, automated Docker builds, and security verification.
terrylin13
A simple demo for learning how to create a containerized Go application using Docker and how to configure a CI/CD pipeline.
AlexMayka
Production automation platform — 18+ services for analytics, document processing, AI pipelines, and enterprise integrations. Python, Go, FastAPI, Django, LLM, Docker. ~13M RUB/year savings across 5 FTE equivalent.
fzl-22
Automated AI Code Reviews for CI pipeline. Go-powered, Docker-ready, and fully customizable via .reviewer prompts. Currently supports Gemini & GitHub.
kwehen
A "full-stack" to-do application including a CI/CD pipeline, Docker, k3s clustering, and more, built to learn Go and DevOps skills.
VishuPatel-27
This repository contains a simple Go web application with a CI/CD pipeline using GitHub Actions, Docker, Kubernetes, Helm and ArgoCD.
joaomdsg
An example GitHub Actions pipeline that uses Docker to build, test, and push a Go application container image to GitHub Container Registry (ghcr.io)