Build and Packaging Architecture

This document describes the build system architecture for dqlitepy, including native library compilation, vendored dependencies, Docker-based builds, and wheel packaging.

Overview

dqlitepy uses a multi-stage build process to create a portable Python wheel that includes vendored C libraries (dqlite and raft) and a Go shim layer for FFI.
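
At a glance, the three stages can be driven end to end. The sketch below is a hypothetical Python driver, not an official entry point: the two scripts it invokes are the ones documented in this section, and python -m build is what the Docker pipeline runs for the wheel stage.

# Hypothetical end-to-end driver for the three build stages described below.
import subprocess

subprocess.run(["bash", "scripts/build_vendor_libs.sh"], check=True)   # Stage 1: C libraries
subprocess.run(["python", "scripts/build_go_lib.py"], check=True)      # Stage 2: Go shim
subprocess.run(["python", "-m", "build", "--wheel", "--outdir", "dist"], check=True)  # Stage 3: wheel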

Build Stages

Stage 1: Vendor Library Build

Compiles dqlite and raft C libraries from vendored sources.

Build Script: scripts/build_vendor_libs.sh

#!/usr/bin/env bash
set -euo pipefail

# Use absolute paths so the second `cd` works after entering the raft tree.
VENDOR_DIR="$(pwd)/vendor"
INSTALL_DIR="${VENDOR_DIR}/install"

# Build raft
echo "Building raft..."
cd "${VENDOR_DIR}/raft-0.10.0"
autoreconf -i
./configure \
    --prefix="${INSTALL_DIR}" \
    --enable-static \
    --disable-shared \
    --enable-debug=no \
    CFLAGS="-O3 -fPIC"
make -j$(nproc)
make install

# Build dqlite
echo "Building dqlite..."
cd "${VENDOR_DIR}/dqlite-1.18.3-fixed"
autoreconf -i
./configure \
    --prefix="${INSTALL_DIR}" \
    --enable-static \
    --disable-shared \
    --enable-debug=no \
    CFLAGS="-O3 -fPIC" \
    PKG_CONFIG_PATH="${INSTALL_DIR}/lib/pkgconfig"
make -j$(nproc)
make install

Key Configuration:

  • --enable-static: Build static libraries for linking into Go
  • --disable-shared: Don't build .so files (not needed)
  • -fPIC: Position-independent code (required because these static archives are later linked into the Go c-shared library)
  • -O3: Maximum optimization level
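
Once Stage 1 finishes, the static archives should exist under the vendor install prefix. A quick sanity check, sketched in Python against the vendor/install layout shown later in this document:

# Sanity-check Stage 1 output: the static archives that raft and dqlite were
# configured to install under vendor/install/lib.
from pathlib import Path

install_lib = Path("vendor/install/lib")
for name in ("libraft.a", "libdqlite.a"):
    archive = install_lib / name
    if not archive.is_file():
        raise SystemExit(f"missing static library: {archive}")
    print(f"found {archive} ({archive.stat().st_size} bytes)")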

Stage 2: Go Shim Build

Compiles the Go shim that wraps go-dqlite into a C-compatible library.

Build Script: scripts/build_go_lib.py

#!/usr/bin/env python3
import os
import subprocess
import platform
from pathlib import Path

def build_go_library():
    """Build the Go shim library."""
    project_root = Path(__file__).parent.parent
    go_dir = project_root / "go"
    vendor_install = project_root / "vendor" / "install"

    # Determine platform
    system = platform.system().lower()
    machine = platform.machine().lower()

    if system == "linux" and machine in ["x86_64", "amd64"]:
        platform_dir = "linux-amd64"
    else:
        raise RuntimeError(f"Unsupported platform: {system}-{machine}")

    output_dir = project_root / "dqlitepy" / "_lib" / platform_dir
    output_dir.mkdir(parents=True, exist_ok=True)

    # Set CGO environment
    env = os.environ.copy()
    env["CGO_ENABLED"] = "1"
    env["CGO_CFLAGS"] = f"-I{vendor_install}/include"
    env["CGO_LDFLAGS"] = (
        f"-L{vendor_install}/lib "
        f"-ldqlite -lraft -lsqlite3 -luv -llz4"
    )

    # Build command
    cmd = [
        "go", "build",
        "-buildmode=c-shared",
        "-o", str(output_dir / "libdqlitepy.so"),
        "./shim",
    ]

    print(f"Building Go library: {' '.join(cmd)}")
    subprocess.run(cmd, cwd=go_dir, env=env, check=True)

    print(f"✓ Library built: {output_dir / 'libdqlitepy.so'}")

if __name__ == "__main__":
    build_go_library()

Go Shim: go/shim/main_with_client.go

package main

// #cgo LDFLAGS: -ldqlite -lraft -lsqlite3 -luv -llz4
import "C"

import (
    "unsafe"

    "github.com/canonical/go-dqlite/v3"
    "github.com/canonical/go-dqlite/v3/client"
)

//export DqliteNodeNew
func DqliteNodeNew(id C.ulonglong, address *C.char, dir *C.char) unsafe.Pointer {
    // Create the dqlite node
    node, err := dqlite.New(
        uint64(id),
        C.GoString(address),
        C.GoString(dir),
    )
    if err != nil {
        return nil
    }
    // node is already a *dqlite.Node; return it to C as an opaque handle.
    return unsafe.Pointer(node)
}

//export DqliteNodeStart
func DqliteNodeStart(handle unsafe.Pointer) C.int {
    node := (*dqlite.Node)(handle)
    if err := node.Start(); err != nil {
        return -1
    }
    return 0
}

// ... more exported functions (cluster membership via the client package, etc.)

CGO Flags:

  • CGO_ENABLED=1: Enable C interoperability
  • CGO_CFLAGS: Include paths for C headers
  • CGO_LDFLAGS: Library paths and link flags
  • -buildmode=c-shared: go build mode that emits a C-compatible shared library plus a generated C header
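
On the Python side, the exported symbols can be declared and loaded with cffi (the runtime dependency declared in Stage 3). The snippet below is an illustrative loader, not necessarily dqlitepy's actual internals; it uses the two exports shown above and the _lib/linux-amd64 path from the packaging config.

# Illustrative cffi loader for the Go shim (hypothetical module layout).
from pathlib import Path

import cffi

ffi = cffi.FFI()
# Declarations mirror the //export functions in go/shim/main_with_client.go.
ffi.cdef("""
    void *DqliteNodeNew(unsigned long long id, char *address, char *dir);
    int DqliteNodeStart(void *handle);
""")

_lib_path = Path(__file__).parent / "_lib" / "linux-amd64" / "libdqlitepy.so"
lib = ffi.dlopen(str(_lib_path))

# Example call: create and start a single-node dqlite instance.
handle = lib.DqliteNodeNew(1, b"127.0.0.1:9001", b"/tmp/dqlite-data")
assert handle != ffi.NULL and lib.DqliteNodeStart(handle) == 0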

Stage 3: Wheel Packaging

Creates a Python wheel with all compiled artifacts.

Package Configuration: pyproject.toml

[project]
name = "dqlitepy"
version = "0.1.0"
description = "Python wrapper for Canonical's dqlite"
requires-python = ">=3.12"
dependencies = [
    "cffi>=1.15.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-cov>=4.0.0",
    "pytest-asyncio>=0.23.0",
    "pyright>=1.1.0",
    "ruff>=0.3.0",
]
sqlalchemy = [
    "sqlalchemy>=2.0.0",
]

[build-system]
requires = ["setuptools>=68.0.0", "wheel"]
build-backend = "setuptools.build_meta"

[tool.setuptools]
packages = ["dqlitepy"]

[tool.setuptools.package-data]
dqlitepy = [
    "_lib/linux-amd64/libdqlitepy.so",
    "_lib/linux-amd64/libdqlitepy.h",
]
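
A wheel is a zip archive, so one way to confirm the package-data entries above actually made it into the build is to list its contents. An illustrative check (the dist/ location and wheel name pattern are assumptions):

# Check that the built wheel bundles the native library declared in
# [tool.setuptools.package-data]. Assumes a wheel already exists in dist/.
import glob
import zipfile

wheels = sorted(glob.glob("dist/dqlitepy-*.whl"))
if not wheels:
    raise SystemExit("no wheel found in dist/")
wheel_path = wheels[-1]
with zipfile.ZipFile(wheel_path) as whl:
    names = whl.namelist()
    if not any(n.endswith("_lib/linux-amd64/libdqlitepy.so") for n in names):
        raise SystemExit(f"native library missing from {wheel_path}")
    print(f"{wheel_path}: libdqlitepy.so is bundled")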

Docker-Based Build

For reproducible builds across platforms, we use Docker multi-stage builds.

Dockerfile: Dockerfile

# Stage 1: Build vendor libraries (raft + dqlite)
FROM ubuntu:22.04 AS vendor-build

RUN apt-get update && apt-get install -y \
        build-essential \
        autoconf \
        automake \
        libtool \
        pkg-config \
        libuv1-dev \
        libsqlite3-dev \
        liblz4-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /build
COPY vendor/ vendor/

# Build raft
WORKDIR /build/vendor/raft-0.10.0
RUN autoreconf -i && \
    ./configure --prefix=/build/vendor/install \
        --enable-static --disable-shared \
        --enable-debug=no CFLAGS="-O3 -fPIC" && \
    make -j$(nproc) && \
    make install

# Build dqlite
WORKDIR /build/vendor/dqlite-1.18.3-fixed
RUN autoreconf -i && \
    ./configure --prefix=/build/vendor/install \
        --enable-static --disable-shared \
        --enable-debug=no CFLAGS="-O3 -fPIC" \
        PKG_CONFIG_PATH=/build/vendor/install/lib/pkgconfig && \
    make -j$(nproc) && \
    make install

# Stage 2: Build Go shim
FROM golang:1.22 AS go-build

# Copy vendor libs from previous stage
COPY --from=vendor-build /build/vendor/install /vendor/install

# Install system dependencies
RUN apt-get update && apt-get install -y \
        libuv1-dev \
        libsqlite3-dev \
        liblz4-dev \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /build
COPY go/ go/

# Build Go library (c-shared also emits the libdqlitepy.h header)
ENV CGO_ENABLED=1
ENV CGO_CFLAGS="-I/vendor/install/include"
ENV CGO_LDFLAGS="-L/vendor/install/lib -ldqlite -lraft -lsqlite3 -luv -llz4"

WORKDIR /build/go
RUN mkdir -p /output && \
    go build -buildmode=c-shared \
    -o /output/libdqlitepy.so \
    ./shim

# Stage 3: Build Python wheel
FROM python:3.12-slim AS wheel-build

# Copy native artifacts from previous stage
COPY --from=go-build /output/libdqlitepy.so /output/libdqlitepy.h /tmp/lib/

# Install build dependencies
RUN pip install --no-cache-dir build wheel

WORKDIR /build
COPY pyproject.toml README.md LICENSE ./
COPY dqlitepy/ dqlitepy/

# Copy native artifacts to package (matching the package-data entries)
RUN mkdir -p dqlitepy/_lib/linux-amd64 && \
    cp /tmp/lib/libdqlitepy.so /tmp/lib/libdqlitepy.h dqlitepy/_lib/linux-amd64/

# Build wheel
RUN python -m build --wheel --outdir /output

# Final stage: Extract wheel
FROM scratch AS output
COPY --from=wheel-build /output/*.whl /

Build Script: scripts/build_wheel_docker.sh

#!/usr/bin/env bash
set -euo pipefail

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(dirname "$SCRIPT_DIR")"
OUTPUT_DIR="${PROJECT_ROOT}/dist"

echo "Building dqlitepy wheel in Docker..."

# Build the wheel-build stage
docker build \
    --target wheel-build \
    --tag dqlitepy-builder:latest \
    --file "${PROJECT_ROOT}/Dockerfile" \
    "${PROJECT_ROOT}"

# Extract the wheel from the image
mkdir -p "${OUTPUT_DIR}"
docker create --name dqlitepy-extract dqlitepy-builder:latest
docker cp dqlitepy-extract:/output/. "${OUTPUT_DIR}/"
docker rm dqlitepy-extract

echo "✓ Wheel built: ${OUTPUT_DIR}/"
ls -lh "${OUTPUT_DIR}/"*.whl

Dependency Management

Vendored C Libraries

Why Vendored?

  1. Version Control: Lock to specific tested versions
  2. Compatibility: Apply patches for known issues
  3. Portability: No external dependencies to install
  4. Reproducibility: Consistent builds across environments

Vendor Directory Structure:

vendor/
├── raft-0.10.0/
│   ├── src/              # Raft implementation
│   ├── include/          # Public headers
│   └── configure.ac      # Autoconf config
├── dqlite-1.18.3-fixed/
│   ├── src/              # dqlite implementation
│   ├── include/          # Public headers
│   └── configure.ac      # Autoconf config
├── build/                # Build artifacts (gitignored)
└── install/              # Installation prefix
    ├── include/          # Combined headers
    └── lib/              # Static libraries
        ├── libraft.a
        ├── libdqlite.a
        └── pkgconfig/

Python Dependencies

Dependency Installation:

# Core installation
uv pip install dqlitepy

# With SQLAlchemy support
uv pip install dqlitepy[sqlalchemy]

# Development installation
uv pip install -e ".[dev,sqlalchemy]"

Go Dependencies

Managed through go.mod:

module github.com/vantagecompute/dqlitepy

go 1.22

// The client package is provided by the go-dqlite/v3 module itself.
require github.com/canonical/go-dqlite/v3 v3.0.3

Dependency Update:

cd go
go get -u github.com/canonical/go-dqlite/v3@latest
go mod tidy
go mod vendor # Optional: vendor dependencies

Platform Support

Target Platforms

Current Status: Linux x86_64 only

Platform Detection:

import platform
import sys

def check_platform():
    """Check if platform is supported."""
    system = platform.system()
    machine = platform.machine()
    python_version = sys.version_info

    if system != "Linux":
        raise RuntimeError(f"Unsupported OS: {system}")

    if machine not in ["x86_64", "amd64"]:
        raise RuntimeError(f"Unsupported architecture: {machine}")

    if python_version < (3, 12):
        raise RuntimeError(f"Python 3.12+ required, got {python_version}")

    return True
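
A typical call site (hypothetical; the package may wire this differently) runs the check before attempting to load the bundled shared library:

# Hypothetical guard: run the platform check before dlopen-ing libdqlitepy.so,
# surfacing the failure as an ImportError at import time.
try:
    check_platform()
except RuntimeError as exc:
    raise ImportError(f"dqlitepy cannot run on this platform: {exc}") from exc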

Build Optimization

Compiler Flags

Recommended Flags:

# For distribution
CFLAGS="-O3 -fPIC"

# For local development
CFLAGS="-O3 -fPIC -march=native -flto"

# For debugging
CFLAGS="-O0 -g -fPIC"

Parallel Builds

Utilize multiple CPU cores:

# Use all available cores
make -j$(nproc)

# Use specific number of cores
make -j4

# Go parallel builds (automatic)
go build # Uses GOMAXPROCS
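
When the build is driven from Python rather than a shell script, the same behaviour as make -j$(nproc) can be reproduced with os.cpu_count(); a small hypothetical helper:

# Hypothetical helper mirroring `make -j$(nproc)` for Python-driven builds.
import os
import subprocess

def run_make(directory: str, jobs: int | None = None) -> None:
    """Run make in `directory` with one job per CPU by default."""
    jobs = jobs or os.cpu_count() or 1
    subprocess.run(["make", f"-j{jobs}"], cwd=directory, check=True)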

Testing the Build

Local Build Test

# Build the native libraries and the Go shim
bash scripts/build_vendor_libs.sh
uv run python scripts/build_go_lib.py

# Install locally
uv pip install -e .

# Run tests
uv run pytest tests/

# Check library loading
uv run python -c "import dqlitepy; print(dqlitepy.__version__)"

Docker Build Test

# Build wheel
bash scripts/build_wheel_docker.sh

# Test wheel in clean environment
docker run --rm -v "$(pwd)/dist:/wheels" python:3.12-slim bash -c \
    "pip install /wheels/*.whl && python -c 'import dqlitepy; print(dqlitepy.__version__)'"

CI/CD Integration

GitHub Actions Workflow

Workflow: .github/workflows/build.yml

name: Build and Test

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]
  release:
    types: [published]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Build wheel
        run: bash scripts/build_wheel_docker.sh

      - name: Upload wheel
        uses: actions/upload-artifact@v4
        with:
          name: wheel
          path: dist/*.whl

  test:
    needs: build
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - uses: actions/download-artifact@v4
        with:
          name: wheel
          path: dist/

      - name: Install and test
        run: |
          pip install dist/*.whl
          pip install pytest pytest-cov
          pytest tests/

  publish:
    if: github.event_name == 'release'
    needs: [build, test]
    runs-on: ubuntu-latest

    steps:
      - uses: actions/download-artifact@v4
        with:
          name: wheel
          path: dist/

      - name: Publish to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          password: ${{ secrets.PYPI_API_TOKEN }}

Troubleshooting

Common Build Issues

Solutions:

  1. Missing System Libraries:

     # Ubuntu/Debian
     sudo apt-get install libuv1-dev libsqlite3-dev liblz4-dev

     # Fedora/RHEL
     sudo dnf install libuv-devel sqlite-devel lz4-devel

  2. CGO Link Errors:

     # Check library paths
     export CGO_LDFLAGS="-L/path/to/libs -ldqlite -lraft"
     export LD_LIBRARY_PATH=/path/to/libs:$LD_LIBRARY_PATH

  3. Go Module Issues:

     # Clean and rebuild
     cd go
     rm -rf vendor go.sum
     go mod tidy
     go mod download

Summary

Build Pipeline

Stage          Input          Output                 Duration
Vendor Build   C sources      Static libs (.a)       ~2 min
Go Build       Go + libs      Shared lib (.so)       ~30 sec
Wheel Build    Python + .so   Wheel (.whl)           ~10 sec
Total          -              Installable package    ~3 min

Key Technologies

  • Autotools: Configure and build C libraries
  • CGO: Bridge Go and C code
  • Docker: Reproducible multi-stage builds
  • setuptools: Python wheel packaging
  • GitHub Actions: CI/CD automation

Best Practices

  • ✅ Use Docker for reproducible builds
  • ✅ Vendor critical dependencies
  • ✅ Build static libraries for portability
  • ✅ Include native libraries in wheel
  • ✅ Test in clean environments
  • ✅ Automate with CI/CD

References