
Making Maven Builds Security-Aware: AppSec Checks Without CI/CD Drift

DEV Community
Nikolay Kuziev

The problem was never that Maven projects could not run security tools. They could. A pipeline can run tests, Dependency-Check, CycloneDX, and SonarQube with a few commands. A `pom.xml` can hold plugin blocks. A team can copy a working configuration from one service to another and call it a standard.

For a while, that works. Then the small differences start showing up. One service has JaCoCo but does not pass the XML report to SonarQube. Another produces Dependency-Check output only as HTML. One multi-module project generates an SBOM from the root aggregator and misses the shape of the real runtime application. Another pipeline forgets merge request metadata, so SonarQube analysis is technically successful but practically incomplete.

That is security build drift. It looks like automation. It behaves like inconsistency.

I built secure-maven-extension to solve that problem for Maven projects. Not by replacing the scanners. By making the Maven lifecycle carry the security workflow.

A typical Maven CI/CD setup starts like this:

```yaml
script:
  - ./mvnw test
  - ./mvnw org.owasp:dependency-check-maven:check
  - ./mvnw org.cyclonedx:cyclonedx-maven-plugin:makeBom
  - ./mvnw sonar:sonar
```

For one repository, this is fine. Across many services, it becomes a maintenance pattern nobody really owns. Some configuration lives in CI/CD. Some lives in `pom.xml`. Some lives in copied documentation. Some depends on environment variables that are not obvious to local developers. Every new service has to rediscover the same setup decisions.

The result is not only duplicated YAML. The result is lower confidence. If local runs do not match CI/CD, developers push just to test the security workflow. If reports are produced in different places, security teams waste time normalizing artifacts. If multi-module projects are wired differently, nobody knows whether the SBOM actually describes the deployable artifact.

At that point, the build is not security-aware. The pipeline is just calling scanners around it.
The usual approach puts too much responsibility into pipeline scripts. CI/CD should be the shared execution layer. It should run clean builds, publish artifacts, enforce gates, and provide auditability. But when CI/CD also owns all scanner configuration, every repository becomes a custom integration point.

That makes local development awkward. A developer can run:

```shell
mvn verify
```

but the pipeline may run a different set of goals, with different properties, report formats, and SonarQube metadata. So the developer cannot fully trust the local result.

This is the gap I wanted to close. The Maven command should stay familiar, but the lifecycle should carry the same AppSec behavior locally and in CI/CD.

The core rule was this: keep the Maven user experience native, but inject repeatable security behavior into the lifecycle. Developers should not need a separate security script for every service. CI/CD should not need to reimplement scanner conventions. Security teams should not need to explain report paths and plugin settings repository by repository. The build should know how to do the boring parts.

That is why this project is a Maven core extension rather than just another command in a pipeline. A normal Maven plugin would still require explicit plugin configuration across projects. That can work, but it does not fully remove copy-paste. A core extension gives an earlier and more powerful integration point.

The extension is loaded from `.mvn/extensions.xml`:

```xml
<extensions>
  <extension>
    <groupId>io.github.niki1337.securebuild</groupId>
    <artifactId>secure-maven-extension</artifactId>
    <version>0.1.0</version>
  </extension>
</extensions>
```

Internally, the extension works during Maven's `afterProjectsRead` stage. That timing matters. At that point, Maven has read the root `pom.xml` and module POMs. Packaging is known. Modules are visible. Existing plugins and properties can be inspected. But the lifecycle has not started yet. That is the useful moment to inject conventions.
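To make that timing concrete, here is a self-contained sketch (illustrative only, not the extension's actual code; the `ProjectModel` record and `plannedGoals` helper are invented for this example) of the kind of decision logic that becomes possible once packaging and existing plugins are known:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Set;

// Sketch of afterProjectsRead-style convention injection: packaging and
// existing plugins are already visible, the lifecycle has not started yet.
public class ConventionSketch {

    // Minimal stand-in for what a Maven project model exposes at that point.
    record ProjectModel(String artifactId, String packaging, Set<String> pluginKeys) {}

    // Decide which security goals to bind, skipping anything the project
    // already configures itself.
    static List<String> plannedGoals(ProjectModel p) {
        List<String> goals = new ArrayList<>();
        boolean javaModule = p.packaging().equals("jar") || p.packaging().equals("war");
        if (javaModule && !p.pluginKeys().contains("org.jacoco:jacoco-maven-plugin")) {
            goals.add("initialize -> jacoco:prepare-agent");
            goals.add("verify -> jacoco:report");
        }
        if (javaModule) {
            goals.add("verify -> dependency-check:check");
            goals.add("package -> cyclonedx:makeBom");
        }
        return goals;
    }

    public static void main(String[] args) {
        ProjectModel api = new ProjectModel("payment-api", "jar", Set.of());
        System.out.println(plannedGoals(api));
    }
}
```

A `pom`-packaged aggregator would get no goals from this sketch, while a plain `jar` module with no JaCoCo configured would get the full set. The real extension inspects the live Maven model instead of a toy record, but the shape of the decision is the same.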
The extension can decide how to configure coverage, Dependency-Check, CycloneDX, and SonarQube before phases like `initialize`, `package`, and `verify` execute.

The extension connects the tools teams already know:

- jacoco-maven-plugin for coverage
- sonar-maven-plugin for SonarQube analysis
- dependency-check-maven for dependency risk reports
- cyclonedx-maven-plugin for SBOM generation

The developer still uses Maven:

```shell
mvn package
mvn verify
mvn sonar:sonar
```

The difference is that these commands become security-aware. For example:

- `mvn package` can build the application and generate a CycloneDX SBOM.
- `mvn verify` can run tests, generate JaCoCo coverage, and execute Dependency-Check.
- `mvn verify sonar:sonar` can send SonarQube analysis with branch, merge request, binary, and coverage metadata already prepared.

That is the whole point: the workflow feels like Maven, not like a pile of scanner commands glued around Maven.

Real environments are messy. Local developers may use `-D...` properties. CI/CD usually provides environment variables. Some stable project defaults belong in `pom.xml`. The extension supports all of those sources:

- environment variables
- Maven user properties
- project properties from `pom.xml`
- system properties

A project can define stable defaults:

```xml
<properties>
  <secure.serviceName>payment-api</secure.serviceName>
  <sonar.projectKey>payment-api</sonar.projectKey>
  <sonar.projectName>Payment API</sonar.projectName>
</properties>
```

CI/CD can provide secrets and environment-specific values:

```shell
export SERVICE_NAME="payment-api"
export SONAR_HOST_URL="https://sonarqube.example.com"
export SONAR_PROJECT_KEY="payment-api"
export SONAR_TOKEN="token-value"
export DT_API_URL="https://dependency-track.example.com"
```

A local developer can override when needed:

```shell
mvn verify \
  -Dsecure.serviceName=payment-api \
  -Dsonar.projectKey=payment-api
```

The goal is not to force one configuration style. The goal is to make the resolved behavior consistent.

Coverage is one of those details that quietly breaks AppSec workflows. SonarQube can run without coverage, but the result is weaker.
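Returning to those configuration sources for a moment: the lookup across sources can be sketched as a small resolver. This is a toy illustration, not the extension's real code, and the precedence shown (`-D` properties over environment variables over `pom.xml` defaults) is an assumption for the example rather than documented behavior:

```java
import java.util.Map;
import java.util.Optional;

// Toy multi-source configuration resolver. Assumed precedence, highest first:
// -D system/user properties, then environment variables, then pom.xml defaults.
public class ConfigResolver {

    static Optional<String> resolve(String key, String envKey,
                                    Map<String, String> systemProps,
                                    Map<String, String> env,
                                    Map<String, String> projectProps) {
        if (systemProps.containsKey(key)) return Optional.of(systemProps.get(key));
        if (env.containsKey(envKey)) return Optional.of(env.get(envKey));
        return Optional.ofNullable(projectProps.get(key));
    }

    public static void main(String[] args) {
        Map<String, String> pom = Map.of("sonar.projectKey", "payment-api");
        Map<String, String> ci = Map.of("SONAR_PROJECT_KEY", "payment-api-ci");
        // With no -D override, the CI environment variable wins over the pom default.
        System.out.println(resolve("sonar.projectKey", "SONAR_PROJECT_KEY",
                Map.of(), ci, pom).get()); // prints "payment-api-ci"
    }
}
```

Whatever the exact ordering, the point stands: resolution happens in one place, so local runs and CI/CD agree on the final value.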
JaCoCo can generate a report, but if XML output is missing or the path is not passed to SonarQube, the analysis is incomplete. The extension injects JaCoCo for Java jar and war projects when JaCoCo is not already configured. It wires the lifecycle like this:

```
initialize -> jacoco:prepare-agent
verify     -> jacoco:report
```

The XML report is generated at `target/site/jacoco/jacoco.xml`. Then the extension passes that path into `sonar.coverage.jacoco.xmlReportPaths`.

This is not exciting work. That is exactly why it should be automated. Repeated boilerplate is where drift hides.

A common mistake is treating SonarQube setup as three variables: URL, project key, token. For Java services, useful analysis also depends on source paths, test paths, compiled binaries, coverage XML, branch metadata, and merge request metadata. The extension prepares properties such as:

- sonar.sources
- sonar.tests
- sonar.java.binaries
- sonar.java.test.binaries
- sonar.coverage.jacoco.xmlReportPaths
- sonar.exclusions
- sonar.test.exclusions
- sonar.cpd.exclusions
- sonar.coverage.exclusions

In GitLab merge request pipelines, it can map CI variables into pull request analysis:

```
CI_MERGE_REQUEST_IID                -> sonar.pullrequest.key
CI_MERGE_REQUEST_SOURCE_BRANCH_NAME -> sonar.pullrequest.branch
CI_MERGE_REQUEST_TARGET_BRANCH_NAME -> sonar.pullrequest.base
```

For normal branch pipelines, it sets branch analysis metadata. This is the kind of logic that becomes fragile when copied into every pipeline file. Inside a core extension, the behavior is versioned and reusable.

Dependency-Check is most useful when the output is predictable. The extension injects Dependency-Check into the lifecycle:

```
single-module: verify -> dependency-check:check
multi-module:  verify -> dependency-check:aggregate
```

It standardizes report formats (HTML, JSON, SARIF, XML) and writes reports to `target/reports/dependency-check`. By default, it disables network-dependent analyzers such as RetireJS, Node audit, Node package analyzer, OSS Index, and hosted suppressions.
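Written by hand, roughly equivalent Dependency-Check defaults in a `pom.xml` would look something like this. This is a sketch using standard dependency-check-maven configuration keys, not a dump of what the extension generates; the extension injects its settings programmatically instead:

```xml
<plugin>
  <groupId>org.owasp</groupId>
  <artifactId>dependency-check-maven</artifactId>
  <configuration>
    <formats>
      <format>HTML</format>
      <format>JSON</format>
      <format>SARIF</format>
      <format>XML</format>
    </formats>
    <outputDirectory>${project.build.directory}/reports/dependency-check</outputDirectory>
    <!-- Network-dependent analyzers off by default -->
    <retireJsAnalyzerEnabled>false</retireJsAnalyzerEnabled>
    <nodeAuditAnalyzerEnabled>false</nodeAuditAnalyzerEnabled>
    <nodeAnalyzerEnabled>false</nodeAnalyzerEnabled>
    <ossindexAnalyzerEnabled>false</ossindexAnalyzerEnabled>
    <hostedSuppressionsEnabled>false</hostedSuppressionsEnabled>
  </configuration>
</plugin>
```

Seeing the block written out makes the copy-paste cost obvious: this is exactly the fragment that otherwise gets duplicated, and slowly diverges, across every service.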
In restricted CI/CD environments, depending on external services can make builds slow, flaky, or inconsistent. When an internal mirror exists, the extension can use it through:

```shell
DT_API_URL=https://dependency-track.example.com mvn verify
```

The default adoption path is visibility first. The build can generate reports without immediately failing by CVSS score. After the team understands the findings and noise level, policy gates can become stricter. That is how I prefer to roll out AppSec checks: start with reliable data, then enforce deliberately.

An SBOM is not useful just because it exists. It should describe the thing the team actually ships. For a single-module Maven application, the extension can run CycloneDX during package:

```
package -> cyclonedx:makeBom
```

Reports are written to `target/reports/cyclonedx`. The SBOM focuses on compile and runtime dependencies and avoids test, provided, and system scopes.

Multi-module builds need more care. The root project is often only an aggregator. Generating an SBOM there can be less meaningful than generating it from the deployable application module. For Spring Boot projects, the extension looks for:

```
org.springframework.boot:spring-boot-maven-plugin
```

If it finds a deployable Spring Boot module, it injects CycloneDX there. If not, it falls back to aggregate SBOM generation on the root:

```
package -> cyclonedx:makeAggregateBom
```

This keeps SBOM generation tied to the application shape instead of blindly producing a file wherever Maven happens to start.

Multi-module Maven builds are where simple CI snippets start to fall apart. A root project may have pom packaging. Modules may be jar or war. Some modules are deployable, some are libraries, some are test fixtures. Coverage should be generated per Java module. Dependency-Check may need aggregate behavior. SonarQube needs paths that reflect the whole project. The extension treats a build as multi-module when Maven sees more than one project and simple mode is not forced.
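The SBOM placement rule described above can be sketched in a few lines (again illustrative, not the extension's code; `Module` and `planSbomGoal` are invented names for the example):

```java
import java.util.List;
import java.util.Set;

// Sketch of the SBOM placement decision: prefer a deployable Spring Boot
// module, otherwise fall back to an aggregate SBOM on the root project.
public class SbomPlan {

    record Module(String artifactId, String packaging, Set<String> pluginKeys) {}

    static String planSbomGoal(List<Module> modules) {
        for (Module m : modules) {
            if (m.pluginKeys().contains("org.springframework.boot:spring-boot-maven-plugin")) {
                return m.artifactId() + ": package -> cyclonedx:makeBom";
            }
        }
        return "root: package -> cyclonedx:makeAggregateBom";
    }

    public static void main(String[] args) {
        List<Module> build = List.of(
            new Module("payment-lib", "jar", Set.of()),
            new Module("payment-app", "jar",
                Set.of("org.springframework.boot:spring-boot-maven-plugin")));
        // The deployable Spring Boot module gets the SBOM, not the aggregator.
        System.out.println(planSbomGoal(build));
    }
}
```

The value of encoding this once is that every repository gets the same answer to "which module does the SBOM describe?" instead of each pipeline improvising.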
It includes Java modules with jar and war packaging and supports filters like `api,service` and `test-fixtures`. In multi-module mode, it configures SonarQube on the root, injects JaCoCo into Java modules, adds module-level paths, runs aggregate Dependency-Check, and generates SBOM output from the most useful module when possible. That is the difference between running a scanner and owning a build convention.

Once Maven owns the conventions, CI/CD can stay small. A security job can be simple:

```yaml
security:maven:
  image: eclipse-temurin:17
  stage: test
  script:
    - ./mvnw -B verify
  artifacts:
    when: always
    expire_in: 7 days
    paths:
      - target/reports/dependency-check/
      - target/reports/cyclonedx/
      - "**/target/reports/dependency-check/"
      - "**/target/reports/cyclonedx/"
      - "**/target/site/jacoco/"
```

SonarQube can run only when a token is available:

```yaml
sonarqube:maven:
  image: eclipse-temurin:17
  stage: test
  script:
    - ./mvnw -B verify sonar:sonar
  rules:
    - if: '$SONAR_TOKEN'
```

The pipeline is now readable because the security wiring is no longer scattered through the YAML. The build owns the behavior. CI/CD runs it.

The Maven extension is not the earliest security layer. For secrets, I want feedback before the commit exists. That is where pre-commit and Gitleaks fit. A local secret scanning hook can stop obvious leaks before code leaves the developer machine. The Maven extension handles the next layer: build-time checks that understand Java, dependencies, coverage, SBOM generation, and SonarQube metadata.

The model is layered:

- before commit: pre-commit hooks, Gitleaks, fast file checks
- local build: `mvn verify`, coverage, Dependency-Check, CycloneDX SBOM
- CI/CD: same Maven lifecycle, artifacts, gates, enforcement

This is important because not every check belongs in the same place. Secret scanning is fast and high-impact, so it belongs very early. Dependency analysis and SBOM generation are heavier and build-aware, so they belong in the build. Final enforcement belongs in CI/CD.
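For the first layer, a minimal `.pre-commit-config.yaml` wiring Gitleaks might look like this (the `rev` tag is illustrative; pin whatever version your team actually uses):

```yaml
repos:
  - repo: https://github.com/gitleaks/gitleaks
    rev: v8.18.4
    hooks:
      - id: gitleaks
```

With that in place, obvious secret leaks are caught before the commit even exists, and the Maven lifecycle never has to see them.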
That separation keeps the workflow practical.

Developers get to keep using Maven. They do not need to memorize a custom AppSec script for every repository. They do not need to push a branch just to learn whether Dependency-Check or SonarQube wiring works. They can run familiar lifecycle commands and get security-aware behavior locally. That makes findings easier to understand: the report appears in the same build context where the code was changed. This matters because good security tooling is not only about detection. It is also about timing, clarity, and trust.

The security team gets fewer custom integrations to chase. Reports are generated in predictable formats and locations. SBOM scope becomes more consistent. Coverage is wired into SonarQube. Merge request metadata is handled in one reusable layer. Multi-module projects stop being a special case every time.

This also makes policy easier to evolve. The team can start with visibility, collect reports, understand noise, and then introduce stricter gates when the data is reliable. That is much better than turning on hard failures before anyone trusts the output.

secure-maven-extension is not another security scanner. It is a build tooling layer for Maven-based Java projects. It moves repeated AppSec wiring out of CI/CD YAML and into the Maven lifecycle, where developers can run it locally and CI/CD can reproduce it cleanly.

The larger pattern is the same one I use across the whole workflow:

- local hooks for fast mistakes
- build tooling for repeatable project checks
- CI/CD for shared verification and enforcement

When that pattern works, security stops being an external script attached to the project and becomes part of how the project is built. That is the real goal.

Project links:

- Secure Build Maven Extension
- Secure Build Gradle Plugin