Script 'mail_helper' called by obssrc

Hello community,

here is the log from the commit of package gitleaks for openSUSE:Factory checked in at 2025-07-24 18:54:27

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Comparing /work/SRC/openSUSE:Factory/gitleaks (Old)
 and      /work/SRC/openSUSE:Factory/.gitleaks.new.13279 (New)
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Package is "gitleaks"

Thu Jul 24 18:54:27 2025 rev:29 rq:1295524 version:8.28.0

Changes:
--------
--- /work/SRC/openSUSE:Factory/gitleaks/gitleaks.changes	2025-06-10 09:09:00.884684873 +0200
+++ /work/SRC/openSUSE:Factory/.gitleaks.new.13279/gitleaks.changes	2025-07-24 18:54:37.312352156 +0200
@@ -1,0 +2,47 @@
+Thu Jul 24 11:01:37 UTC 2025 - Johannes Kastl <opensuse_buildserv...@ojkastl.de>
+
+- Update to version 8.28.0:
+  * Changelog
+    - cant count
+    - Composite rules (#1905)
+    - feat: add Anthropic API key detection (#1910)
+    - fix(git): handle port (#1912)
+    - dont prematurely calculate fragment newlines (#1909)
+    - feat(allowlist): promote optimizations (#1908)
+    - Fix: CVEs on go and go crypto (#1868)
+    - feat: add artifactory reference token and api key detection
+      (#1906)
+    - silly
+    - Update gitleaks.yml
+    - add just like that, no leaks
+  * Optimizations
+    - #1909 waits to find newlines until a match. This ends up
+      saving a boat load of time since before we were finding
+      newlines for every fragment regardless if a rule matched or
+      not.
+    - #1908 promoted @rgmz excellent stopword optimization
+  * Composite Rules (Multi-part or required Rules) #1905
+    In v8.28.0 Gitleaks introduced composite rules, which are made
+    up of a single "primary" rule and one or more auxiliary or
+    required rules. To create a composite rule, add a
+    [[rules.required]] table to the primary rule specifying an id
+    and optionally withinLines and/or withinColumns proximity
+    constraints. A fragment is a chunk of content that Gitleaks
+    processes at once (typically a file, part of a file, or git
+    diff), and proximity matching instructs the primary rule to
+    only report a finding if the auxiliary required rules also find
+    matches within the specified area of the fragment.
+    Proximity matching: Using the withinLines and withinColumns
+    fields instructs the primary rule to only report a finding if
+    the auxiliary required rules also find matches within the
+    specified proximity.
+    You can set:
+    - withinLines: N - required findings must be within N lines
+      (vertically)
+    - withinColumns: N - required findings must be within N
+      characters (horizontally)
+    - Both - creates a rectangular search area (both constraints
+      must be satisfied)
+    - Neither - fragment-level matching (required findings can be
+      anywhere in the same fragment)
+
+-------------------------------------------------------------------

Old:
----
  gitleaks-8.27.2.obscpio

New:
----
  gitleaks-8.28.0.obscpio

++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Other differences:
------------------
++++++ gitleaks.spec ++++++
--- /var/tmp/diff_new_pack.7wiTow/_old	2025-07-24 18:54:38.248390962 +0200
+++ /var/tmp/diff_new_pack.7wiTow/_new	2025-07-24 18:54:38.252391127 +0200
@@ -18,7 +18,7 @@

 Name:           gitleaks
-Version:        8.27.2
+Version:        8.28.0
 Release:        0
 Summary:        Protect and discover secrets using Gitleaks
 License:        MIT

++++++ _service ++++++
--- /var/tmp/diff_new_pack.7wiTow/_old	2025-07-24 18:54:38.296392951 +0200
+++ /var/tmp/diff_new_pack.7wiTow/_new	2025-07-24 18:54:38.304393283 +0200
@@ -3,7 +3,7 @@
     <param name="url">https://github.com/zricethezav/gitleaks</param>
     <param name="scm">git</param>
     <param name="exclude">.git</param>
-    <param name="revision">v8.27.2</param>
+    <param name="revision">v8.28.0</param>
     <param name="versionformat">@PARENT_TAG@</param>
     <param name="versionrewrite-pattern">v(.*)</param>
     <param name="changesgenerate">enable</param>

++++++ _servicedata ++++++
--- /var/tmp/diff_new_pack.7wiTow/_old	2025-07-24 18:54:38.324394112 +0200
+++ /var/tmp/diff_new_pack.7wiTow/_new	2025-07-24 18:54:38.332394444 +0200
@@ -1,6 +1,6 @@
 <servicedata>
 <service name="tar_scm">
   <param name="url">https://github.com/zricethezav/gitleaks</param>
-  <param name="changesrevision">c7acf33d962e8effc070072f993c365af19e3661</param></service></servicedata>
+  <param name="changesrevision">4fb43823ef3d152d239e92d7d5cb04783b548062</param></service></servicedata>
(No newline at
EOF)

++++++ gitleaks-8.27.2.obscpio -> gitleaks-8.28.0.obscpio ++++++

diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/README.md new/gitleaks-8.28.0/README.md
--- old/gitleaks-8.27.2/README.md	2025-06-09 02:31:25.000000000 +0200
+++ new/gitleaks-8.28.0/README.md	2025-07-20 18:14:25.000000000 +0200
@@ -415,6 +415,94 @@
 ### Additional Configuration
 
+#### Composite Rules (Multi-part or `required` Rules)
+In v8.28.0 Gitleaks introduced composite rules, which are made up of a single "primary" rule and one or more auxiliary or `required` rules. To create a composite rule, add a `[[rules.required]]` table to the primary rule specifying an `id` and optionally `withinLines` and/or `withinColumns` proximity constraints. A fragment is a chunk of content that Gitleaks processes at once (typically a file, part of a file, or git diff), and proximity matching instructs the primary rule to only report a finding if the auxiliary `required` rules also find matches within the specified area of the fragment.
+
+**Proximity matching:** Using the `withinLines` and `withinColumns` fields instructs the primary rule to only report a finding if the auxiliary `required` rules also find matches within the specified proximity.
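As a concrete sketch of what such a configuration might look like, here is a hypothetical `gitleaks.toml` fragment. The rule IDs, regexes, and proximity values are invented for illustration, and the `skipReport` key name is assumed from the `SkipReport` field added in this release:

```toml
# Hypothetical composite rule: only report a username assignment
# when a password-looking value is found nearby.
[[rules]]
id = "username-with-password"
description = "Username assignment accompanied by a nearby password"
regex = '''username\s*=\s*"([^"]+)"'''

  [[rules.required]]
  id = "generic-password"
  withinLines = 5      # auxiliary match must be within 5 lines
  withinColumns = 40   # and within 40 columns

# The auxiliary rule referenced above must exist as a normal rule;
# skipReport (assumed spelling) keeps it from producing standalone findings.
[[rules]]
id = "generic-password"
description = "Password assignment"
regex = '''password\s*=\s*"([^"]+)"'''
skipReport = true
```

Omitting both `withinLines` and `withinColumns` would fall back to fragment-level matching, as described below.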
You can set: + +- **`withinLines: N`** - required findings must be within N lines (vertically) +- **`withinColumns: N`** - required findings must be within N characters (horizontally) +- **Both** - creates a rectangular search area (both constraints must be satisfied) +- **Neither** - fragment-level matching (required findings can be anywhere in the same fragment) + +Here are diagrams illustrating each proximity behavior: + +``` +p = primary captured secret +a = auxiliary (required) captured secret +fragment = section of data gitleaks is looking at + + + *Fragment-level proximity* + Any required finding in the fragment + ┌────────┐ + ┌──────┤fragment├─────┐ + │ └──────┬─┤ │ ┌───────┐ + │ │a│◀────┼─│✓ MATCH│ + │ ┌─┐└─┘ │ └───────┘ + │┌─┐ │p│ │ + ││a│ ┌─┐└─┘ │ ┌───────┐ + │└─┘ │a│◀──────────┼─│✓ MATCH│ + └─▲─────┴─┴───────────┘ └───────┘ + │ ┌───────┐ + └────│✓ MATCH│ + └───────┘ + + + *Column bounded proximity* + `withinColumns = 3` + ┌────────┐ + ┌────┬─┤fragment├─┬───┐ + │ └──────┬─┤ │ ┌───────────┐ + │ │ │a│◀┼───┼─│+1C ✓ MATCH│ + │ ┌─┐└─┘ │ └───────────┘ + │┌─┐ │ │p│ │ │ +┌──▶│a│ ┌─┐ └─┘ │ ┌───────────┐ +│ │└─┘ ││a│◀────────┼───┼─│-2C ✓ MATCH│ +│ │ ┘ │ └───────────┘ +│ └── -3C ───0C─── +3C ─┘ +│ ┌─────────┐ +│ │ -4C ✗ NO│ +└──│ MATCH │ + └─────────┘ + + + *Line bounded proximity* + `withinLines = 4` + ┌────────┐ + ┌─────┤fragment├─────┐ + +4L─ ─ ┴────────┘─ ─ ─│ + │ │ + │ ┌─┐ │ ┌────────────┐ + │ ┌─┐ │a│◀──┼─│+1L ✓ MATCH │ + 0L ┌─┐ │p│ └─┘ │ ├────────────┤ + │ │a│◀──┴─┴────────┼─│-1L ✓ MATCH │ + │ └─┘ │ └────────────┘ + │ │ ┌─────────┐ + -4L─ ─ ─ ─ ─ ─ ─ ─┌─┐─│ │-5L ✗ NO │ + │ │a│◀┼─│ MATCH │ + └────────────────┴─┴─┘ └─────────┘ + + + *Line and column bounded proximity* + `withinLines = 4` + `withinColumns = 3` + ┌────────┐ + ┌─────┤fragment├─────┐ + +4L ┌└────────┴ ┐ │ + │ ┌─┐ │ ┌───────────────┐ + │ │ │a│◀┼───┼─│+2L/+1C ✓ MATCH│ + │ ┌─┐└─┘ │ └───────────────┘ + 0L │ │p│ │ │ + │ └─┘ │ + │ │ │ │ ┌────────────┐ + -4L ─ ─ ─ ─ ─ ─┌─┐ │ │-5L/+3C ✗ NO│ + │ │a│◀┼─│ 
MATCH │ + └───-3C────0L───+3C┴─┘ └────────────┘ +``` + +<details><summary>Some final quick thoughts on composite rules.</summary>This is an experimental feature! It's subject to change so don't go sellin' a new B2B SaaS feature built ontop of this feature. Scan type (git vs dir) based context is interesting. I'm monitoring the situation. Composite rules might not be super useful for git scans because gitleaks only looks at additions in the git history. It could be useful to scan non-additions in git history for `required` rules. Oh, right this is a readme, I'll shut up now.</details> + #### gitleaks:allow If you are knowingly committing a test secret that gitleaks will catch you can add a `gitleaks:allow` comment to that line which will instruct gitleaks diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/cmd/generate/config/main.go new/gitleaks-8.28.0/cmd/generate/config/main.go --- old/gitleaks-8.27.2/cmd/generate/config/main.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/cmd/generate/config/main.go 2025-07-20 18:14:25.000000000 +0200 @@ -35,6 +35,10 @@ rules.AlgoliaApiKey(), rules.AlibabaAccessKey(), rules.AlibabaSecretKey(), + rules.AnthropicAdminApiKey(), + rules.AnthropicApiKey(), + rules.ArtifactoryApiKey(), + rules.ArtifactoryReferenceToken(), rules.AsanaClientID(), rules.AsanaClientSecret(), rules.Atlassian(), diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/cmd/generate/config/rules/anthropic.go new/gitleaks-8.28.0/cmd/generate/config/rules/anthropic.go --- old/gitleaks-8.27.2/cmd/generate/config/rules/anthropic.go 1970-01-01 01:00:00.000000000 +0100 +++ new/gitleaks-8.28.0/cmd/generate/config/rules/anthropic.go 2025-07-20 18:14:25.000000000 +0200 @@ -0,0 +1,69 @@ +package rules + +import ( + "github.com/zricethezav/gitleaks/v8/cmd/generate/config/utils" + "github.com/zricethezav/gitleaks/v8/cmd/generate/secrets" + 
"github.com/zricethezav/gitleaks/v8/config" +) + +func AnthropicApiKey() *config.Rule { + // define rule + r := config.Rule{ + RuleID: "anthropic-api-key", + Description: "Identified an Anthropic API Key, which may compromise AI assistant integrations and expose sensitive data to unauthorized access.", + Regex: utils.GenerateUniqueTokenRegex(`sk-ant-api03-[a-zA-Z0-9_\-]{93}AA`, false), + Keywords: []string{ + "sk-ant-api03", + }, + } + + // validate + tps := []string{ + // Valid API key example + "sk-ant-api03-abc123xyz-456def789ghij-klmnopqrstuvwx-3456yza789bcde-1234fghijklmnopby56aaaogaopaaaabc123xyzAA", + // Generate additional random test keys + utils.GenerateSampleSecret("anthropic", "sk-ant-api03-"+secrets.NewSecret(utils.AlphaNumericExtendedShort("93"))+"AA"), + } + + fps := []string{ + // Too short key (missing characters) + "sk-ant-api03-abc123xyz-456de-klMnopqrstuvwx-3456yza789bcde-1234fghijklmnopAA", + // Wrong suffix + "sk-ant-api03-abc123xyz-456def789ghij-klmnopqrstuvwx-3456yza789bcde-1234fghijklmnopby56aaaogaopaaaabc123xyzBB", + // Wrong prefix (admin key, not API key) + "sk-ant-admin01-abc123xyz-456def789ghij-klmnopqrstuvwx-3456yza789bcde-1234fghijklmnopby56aaaogaopaaaabc123xyzAA", + } + + return utils.Validate(r, tps, fps) +} + +func AnthropicAdminApiKey() *config.Rule { + // define rule + r := config.Rule{ + RuleID: "anthropic-admin-api-key", + Description: "Detected an Anthropic Admin API Key, risking unauthorized access to administrative functions and sensitive AI model configurations.", + Regex: utils.GenerateUniqueTokenRegex(`sk-ant-admin01-[a-zA-Z0-9_\-]{93}AA`, false), + Keywords: []string{ + "sk-ant-admin01", + }, + } + + // validate + tps := []string{ + // Valid admin key example + "sk-ant-admin01-abc12fake-456def789ghij-klmnopqrstuvwx-3456yza789bcde-12fakehijklmnopby56aaaogaopaaaabc123xyzAA", + // Generate additional random test keys + utils.GenerateSampleSecret("anthropic", 
"sk-ant-admin01-"+secrets.NewSecret(utils.AlphaNumericExtendedShort("93"))+"AA"), + } + + fps := []string{ + // Too short key (missing characters) + "sk-ant-admin01-abc123xyz-456de-klMnopqrstuvwx-3456yza789bcde-1234fghijklmnopAA", + // Wrong suffix + "sk-ant-admin01-abc123xyz-456def789ghij-klmnopqrstuvwx-3456yza789bcde-1234fghijklmnopby56aaaogaopaaaabc123xyzBB", + // Wrong prefix (API key, not admin key) + "sk-ant-api03-abc123xyz-456def789ghij-klmnopqrstuvwx-3456yza789bcde-1234fghijklmnopby56aaaogaopaaaabc123xyzAA", + } + + return utils.Validate(r, tps, fps) +} diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/cmd/generate/config/rules/artifactory.go new/gitleaks-8.28.0/cmd/generate/config/rules/artifactory.go --- old/gitleaks-8.27.2/cmd/generate/config/rules/artifactory.go 1970-01-01 01:00:00.000000000 +0100 +++ new/gitleaks-8.28.0/cmd/generate/config/rules/artifactory.go 2025-07-20 18:14:25.000000000 +0200 @@ -0,0 +1,58 @@ +package rules + +import ( + "github.com/zricethezav/gitleaks/v8/cmd/generate/config/utils" + "github.com/zricethezav/gitleaks/v8/cmd/generate/secrets" + "github.com/zricethezav/gitleaks/v8/config" + "github.com/zricethezav/gitleaks/v8/regexp" +) + +func ArtifactoryApiKey() *config.Rule { + // define rule + r := config.Rule{ + RuleID: "artifactory-api-key", + Description: "Detected an Artifactory api key, posing a risk unauthorized access to the central repository.", + Regex: regexp.MustCompile(`\bAKCp[A-Za-z0-9]{69}\b`), + Entropy: 4.5, + Keywords: []string{"AKCp"}, + } + + // validate + tps := []string{ + "artifactoryApiKey := \"AKCp" + secrets.NewSecret(utils.AlphaNumeric("69")) + "\"", + } + // false positives + fps := []string{ + `lowEntropy := AKCpXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX`, + "wrongStart := \"AkCp" + secrets.NewSecret(utils.AlphaNumeric("69")) + "\"", + "wrongLength := \"AkCp" + secrets.NewSecret(utils.AlphaNumeric("59")) + "\"", + 
"partOfAlongUnrelatedBlob gYnkgAkCp" + secrets.NewSecret(utils.AlphaNumeric("69")) + "VyZSB2", + } + + return utils.Validate(r, tps, fps) +} + +func ArtifactoryReferenceToken() *config.Rule { + // define rule + r := config.Rule{ + RuleID: "artifactory-reference-token", + Description: "Detected an Artifactory reference token, posing a risk of impersonation and unauthorized access to the central repository.", + Regex: regexp.MustCompile(`\bcmVmd[A-Za-z0-9]{59}\b`), + Entropy: 4.5, + Keywords: []string{"cmVmd"}, + } + + // validate + tps := []string{ + "artifactoryRefToken := \"cmVmd" + secrets.NewSecret(utils.AlphaNumeric("59")) + "\"", + } + // false positives + fps := []string{ + `lowEntropy := cmVmdXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX`, + "wrongStart := \"cmVMd" + secrets.NewSecret(utils.AlphaNumeric("59")) + "\"", + "wrongLength := \"cmVmd" + secrets.NewSecret(utils.AlphaNumeric("49")) + "\"", + "partOfAlongUnrelatedBlob gYnkgcmVmd" + secrets.NewSecret(utils.AlphaNumeric("59")) + "VyZSB2", + } + + return utils.Validate(r, tps, fps) +} diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/cmd/root.go new/gitleaks-8.28.0/cmd/root.go --- old/gitleaks-8.27.2/cmd/root.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/cmd/root.go 2025-07-20 18:14:25.000000000 +0200 @@ -77,7 +77,6 @@ rootCmd.PersistentFlags().StringP("gitleaks-ignore-path", "i", ".", "path to .gitleaksignore file or folder containing one") rootCmd.PersistentFlags().Int("max-decode-depth", 0, "allow recursive decoding up to this depth (default \"0\", no decoding is done)") rootCmd.PersistentFlags().Int("max-archive-depth", 0, "allow scanning into nested archives up to this depth (default \"0\", no archive traversal is done)") - rootCmd.PersistentFlags().BoolP("experimental-optimizations", "", false, "enables experimental allowlist optimizations, increasing performance at the cost of startup time") // Add 
diagnostics flags rootCmd.PersistentFlags().String("diagnostics", "", "enable diagnostics (http OR comma-separated list: cpu,mem,trace). cpu=CPU prof, mem=memory prof, trace=exec tracing, http=serve via net/http/pprof") @@ -223,11 +222,7 @@ if err := viper.Unmarshal(&vc); err != nil { logging.Fatal().Err(err).Msg("Failed to load config") } - // set experimental feature flag(s) - if mustGetBoolFlag(cmd, "experimental-optimizations") { - logging.Warn().Msgf("using experimental allowlist optimizations, updates may contain breaking changes!") - vc.EnableExperimentalAllowlistOptimizations = true - } + cfg, err := vc.Translate() if err != nil { logging.Fatal().Err(err).Msg("Failed to load config") diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/config/allowlist.go new/gitleaks-8.28.0/config/allowlist.go --- old/gitleaks-8.27.2/config/allowlist.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/config/allowlist.go 2025-07-20 18:14:25.000000000 +0200 @@ -59,11 +59,6 @@ // validated is an internal flag to track whether `Validate()` has been called. validated bool - // EnableExperimentalOptimizations must be set prior to calling `Validate()`. - // See: https://github.com/gitleaks/gitleaks/pull/1731 - // - // NOTE: This flag may be removed in the future. - EnableExperimentalOptimizations bool // commitMap is a normalized version of Commits, used for efficiency purposes. // TODO: possible optimizations so that both short and long hashes work. commitMap map[string]struct{} @@ -92,11 +87,8 @@ // Commits are case-insensitive. 
uniqueCommits[strings.TrimSpace(strings.ToLower(commit))] = struct{}{} } - if a.EnableExperimentalOptimizations { - a.commitMap = uniqueCommits - } else { - a.Commits = maps.Keys(uniqueCommits) - } + a.Commits = maps.Keys(uniqueCommits) + a.commitMap = uniqueCommits } if len(a.StopWords) > 0 { uniqueStopwords := make(map[string]struct{}) @@ -105,21 +97,16 @@ } values := maps.Keys(uniqueStopwords) - if a.EnableExperimentalOptimizations { - a.stopwordTrie = ahocorasick.NewTrieBuilder().AddStrings(values).Build() - } else { - a.StopWords = values - } + a.StopWords = values + a.stopwordTrie = ahocorasick.NewTrieBuilder().AddStrings(values).Build() } // Combine patterns into a single expression. - if a.EnableExperimentalOptimizations { - if len(a.Paths) > 0 { - a.pathPat = joinRegexOr(a.Paths) - } - if len(a.Regexes) > 0 { - a.regexPat = joinRegexOr(a.Regexes) - } + if len(a.Paths) > 0 { + a.pathPat = joinRegexOr(a.Paths) + } + if len(a.Regexes) > 0 { + a.regexPat = joinRegexOr(a.Regexes) } a.validated = true diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/config/allowlist_test.go new/gitleaks-8.28.0/config/allowlist_test.go --- old/gitleaks-8.27.2/config/allowlist_test.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/config/allowlist_test.go 2025-07-20 18:14:25.000000000 +0200 @@ -151,7 +151,7 @@ cmpopts.IgnoreUnexported(Allowlist{}), } ) - if diff := cmp.Diff(tt.input, tt.expected, opts); diff != "" { + if diff := cmp.Diff(tt.expected, tt.input, opts); diff != "" { t.Errorf("diff: (-want +got)\n%s", diff) } } diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/config/config.go new/gitleaks-8.28.0/config/config.go --- old/gitleaks-8.27.2/config/config.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/config/config.go 2025-07-20 18:14:25.000000000 +0200 @@ -43,19 +43,23 @@ // Deprecated: this is a shim for 
backwards-compatibility. // TODO: Remove this in 9.x. - AllowList *viperRuleAllowlist + AllowList *viperRuleAllowlist + Allowlists []*viperRuleAllowlist + Required []*viperRequired + SkipReport bool } // Deprecated: this is a shim for backwards-compatibility. // TODO: Remove this in 9.x. - AllowList *viperGlobalAllowlist + AllowList *viperGlobalAllowlist + Allowlists []*viperGlobalAllowlist +} - // EnableExperimentalAllowlistOptimizations enables a preview feature. - // See: https://github.com/gitleaks/gitleaks/pull/1731 - // - // NOTE: This flag may be removed in the future. - EnableExperimentalAllowlistOptimizations bool +type viperRequired struct { + ID string + WithinLines *int `mapstructure:"withinLines"` + WithinColumns *int `mapstructure:"withinColumns"` } type viperRuleAllowlist struct { @@ -136,6 +140,7 @@ Path: pathPat, Keywords: vr.Keywords, Tags: vr.Tags, + SkipReport: vr.SkipReport, } // Parse the rule allowlists, including the older format for backwards compatibility. @@ -153,10 +158,34 @@ } cr.Allowlists = append(cr.Allowlists, allowlist) } + + for _, r := range vr.Required { + if r.ID == "" { + return Config{}, fmt.Errorf("%s: [[rules.required]] rule ID is empty", cr.RuleID) + } + requiredRule := Required{ + RuleID: r.ID, + WithinLines: r.WithinLines, + WithinColumns: r.WithinColumns, + // Distance: r.Distance, + } + cr.RequiredRules = append(cr.RequiredRules, &requiredRule) + } + orderedRules = append(orderedRules, cr.RuleID) rulesMap[cr.RuleID] = cr } + // after all the rules have been processed, let's ensure the required rules + // actually exist. + for _, r := range rulesMap { + for _, rr := range r.RequiredRules { + if _, ok := rulesMap[rr.RuleID]; !ok { + return Config{}, fmt.Errorf("%s: [[rules.required]] rule ID '%s' does not exist", r.RuleID, rr.RuleID) + } + } + } + // Assemble the config. 
c := Config{ Title: vc.Title, @@ -261,14 +290,13 @@ } allowlist := &Allowlist{ - Description: a.Description, - MatchCondition: matchCondition, - Commits: a.Commits, - Paths: allowlistPaths, - RegexTarget: regexTarget, - Regexes: allowlistRegexes, - StopWords: a.StopWords, - EnableExperimentalOptimizations: vc.EnableExperimentalAllowlistOptimizations, + Description: a.Description, + MatchCondition: matchCondition, + Commits: a.Commits, + Paths: allowlistPaths, + RegexTarget: regexTarget, + Regexes: allowlistRegexes, + StopWords: a.StopWords, } if err := allowlist.Validate(); err != nil { return nil, err @@ -292,9 +320,7 @@ if err := viper.ReadConfig(strings.NewReader(DefaultConfig)); err != nil { return fmt.Errorf("failed to load extended default config, err: %w", err) } - defaultViperConfig := ViperConfig{ - EnableExperimentalAllowlistOptimizations: parent.EnableExperimentalAllowlistOptimizations, - } + defaultViperConfig := ViperConfig{} if err := viper.Unmarshal(&defaultViperConfig); err != nil { return fmt.Errorf("failed to load extended default config, err: %w", err) } @@ -314,9 +340,7 @@ if err := viper.ReadInConfig(); err != nil { return fmt.Errorf("failed to load extended config, err: %w", err) } - extensionViperConfig := ViperConfig{ - EnableExperimentalAllowlistOptimizations: parent.EnableExperimentalAllowlistOptimizations, - } + extensionViperConfig := ViperConfig{} if err := viper.Unmarshal(&extensionViperConfig); err != nil { return fmt.Errorf("failed to load extended config, err: %w", err) } diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/config/gitleaks.toml new/gitleaks-8.28.0/config/gitleaks.toml --- old/gitleaks-8.27.2/config/gitleaks.toml 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/config/gitleaks.toml 2025-07-20 18:14:25.000000000 +0200 @@ -130,6 +130,32 @@ keywords = ["alibaba"] [[rules]] +id = "anthropic-admin-api-key" +description = "Detected an Anthropic Admin API 
Key, risking unauthorized access to administrative functions and sensitive AI model configurations." +regex = '''\b(sk-ant-admin01-[a-zA-Z0-9_\-]{93}AA)(?:[\x60'"\s;]|\\[nr]|$)''' +keywords = ["sk-ant-admin01"] + +[[rules]] +id = "anthropic-api-key" +description = "Identified an Anthropic API Key, which may compromise AI assistant integrations and expose sensitive data to unauthorized access." +regex = '''\b(sk-ant-api03-[a-zA-Z0-9_\-]{93}AA)(?:[\x60'"\s;]|\\[nr]|$)''' +keywords = ["sk-ant-api03"] + +[[rules]] +id = "artifactory-api-key" +description = "Detected an Artifactory api key, posing a risk unauthorized access to the central repository." +regex = '''\bAKCp[A-Za-z0-9]{69}\b''' +entropy = 4.5 +keywords = ["akcp"] + +[[rules]] +id = "artifactory-reference-token" +description = "Detected an Artifactory reference token, posing a risk of impersonation and unauthorized access to the central repository." +regex = '''\bcmVmd[A-Za-z0-9]{59}\b''' +entropy = 4.5 +keywords = ["cmvmd"] + +[[rules]] id = "asana-client-id" description = "Discovered a potential Asana Client ID, risking unauthorized access to Asana projects and sensitive task information." regex = '''(?i)[\w.-]{0,50}?(?:asana)(?:[ \t\w.-]{0,20})[\s'"]{0,3}(?:=|>|:{1,3}=|\|\||:|=>|\?=|,)[\x60'"\s=]{0,5}([0-9]{16})(?:[\x60'"\s;]|\\[nr]|$)''' diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/config/rule.go new/gitleaks-8.28.0/config/rule.go --- old/gitleaks-8.27.2/config/rule.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/config/rule.go 2025-07-20 18:14:25.000000000 +0200 @@ -46,6 +46,18 @@ // validated is an internal flag to track whether `Validate()` has been called. validated bool + + // If a rule has RequiredRules, it makes the rule dependent on the RequiredRules. + // In otherwords, this rule is now a composite rule. 
+ RequiredRules []*Required + + SkipReport bool +} + +type Required struct { + RuleID string + WithinLines *int + WithinColumns *int } // Validate guards against common misconfigurations. diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/detect/detect.go new/gitleaks-8.28.0/detect/detect.go --- old/gitleaks-8.27.2/detect/detect.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/detect/detect.go 2025-07-20 18:14:25.000000000 +0200 @@ -292,9 +292,6 @@ return findings } - // add newline indices for location calculation in detectRule - newlineIndices := newLineRegexp.FindAllStringIndex(fragment.Raw, -1) - // setup variables to handle different decoding passes currentRaw := fragment.Raw encodedSegments := []*codec.EncodedSegment{} @@ -314,14 +311,14 @@ if len(rule.Keywords) == 0 { // if no keywords are associated with the rule always scan the // fragment using the rule - findings = append(findings, d.detectRule(fragment, newlineIndices, currentRaw, rule, encodedSegments)...) + findings = append(findings, d.detectRule(fragment, currentRaw, rule, encodedSegments)...) continue } // check if keywords are in the fragment for _, k := range rule.Keywords { if _, ok := keywords[strings.ToLower(k)]; ok { - findings = append(findings, d.detectRule(fragment, newlineIndices, currentRaw, rule, encodedSegments)...) + findings = append(findings, d.detectRule(fragment, currentRaw, rule, encodedSegments)...) 
break } } @@ -348,7 +345,7 @@ } // detectRule scans the given fragment for the given rule and returns a list of findings -func (d *Detector) detectRule(fragment Fragment, newlineIndices [][]int, currentRaw string, r config.Rule, encodedSegments []*codec.EncodedSegment) []report.Finding { +func (d *Detector) detectRule(fragment Fragment, currentRaw string, r config.Rule, encodedSegments []*codec.EncodedSegment) []report.Finding { var ( findings []report.Finding logger = func() zerolog.Logger { @@ -360,6 +357,10 @@ }() ) + if r.SkipReport == true && !fragment.InheritedFromFinding { + return findings + } + // check if commit or file is allowed for this rule. if isAllowed, event := checkCommitOrPathAllowed(logger, fragment, r.Allowlists); isAllowed { event.Msg("skipping file: rule allowlist") @@ -415,6 +416,14 @@ } } + matches := r.Regex.FindAllStringIndex(currentRaw, -1) + if len(matches) == 0 { + return findings + } + + // TODO profile this, probably should replace with something more efficient + newlineIndices := newLineRegexp.FindAllStringIndex(fragment.Raw, -1) + // use currentRaw instead of fragment.Raw since this represents the current // decoding pass on the text for _, matchIndex := range r.Regex.FindAllStringIndex(currentRaw, -1) { @@ -534,7 +543,145 @@ } findings = append(findings, finding) } - return findings + + // Handle required rules (multi-part rules) + if fragment.InheritedFromFinding || len(r.RequiredRules) == 0 { + return findings + } + + // Process required rules and create findings with auxiliary findings + return d.processRequiredRules(fragment, currentRaw, r, encodedSegments, findings, logger) +} + +// processRequiredRules handles the logic for multi-part rules with auxiliary findings +func (d *Detector) processRequiredRules(fragment Fragment, currentRaw string, r config.Rule, encodedSegments []*codec.EncodedSegment, primaryFindings []report.Finding, logger zerolog.Logger) []report.Finding { + if len(primaryFindings) == 0 { + 
logger.Debug().Msg("no primary findings to process for required rules") + return primaryFindings + } + + // Pre-collect all required rule findings once + allRequiredFindings := make(map[string][]report.Finding) + + for _, requiredRule := range r.RequiredRules { + rule, ok := d.Config.Rules[requiredRule.RuleID] + if !ok { + logger.Error().Str("rule-id", requiredRule.RuleID).Msg("required rule not found in config") + continue + } + + // Mark fragment as inherited to prevent infinite recursion + inheritedFragment := fragment + inheritedFragment.InheritedFromFinding = true + + // Call detectRule once for each required rule + requiredFindings := d.detectRule(inheritedFragment, currentRaw, rule, encodedSegments) + allRequiredFindings[requiredRule.RuleID] = requiredFindings + + logger.Debug(). + Str("rule-id", requiredRule.RuleID). + Int("findings", len(requiredFindings)). + Msg("collected required rule findings") + } + + var finalFindings []report.Finding + + // Now process each primary finding against the pre-collected required findings + for _, primaryFinding := range primaryFindings { + var requiredFindings []*report.RequiredFinding + + for _, requiredRule := range r.RequiredRules { + foundRequiredFindings, exists := allRequiredFindings[requiredRule.RuleID] + if !exists { + continue // Rule wasn't found earlier, skip + } + + // Filter findings that are within proximity of the primary finding + for _, requiredFinding := range foundRequiredFindings { + if d.withinProximity(primaryFinding, requiredFinding, requiredRule) { + req := &report.RequiredFinding{ + RuleID: requiredFinding.RuleID, + StartLine: requiredFinding.StartLine, + EndLine: requiredFinding.EndLine, + StartColumn: requiredFinding.StartColumn, + EndColumn: requiredFinding.EndColumn, + Line: requiredFinding.Line, + Match: requiredFinding.Match, + Secret: requiredFinding.Secret, + } + requiredFindings = append(requiredFindings, req) + } + } + } + + // Check if we have at least one auxiliary finding for each 
required rule + if len(requiredFindings) > 0 && d.hasAllRequiredRules(requiredFindings, r.RequiredRules) { + // Create a finding with auxiliary findings + newFinding := primaryFinding // Copy the primary finding + newFinding.AddRequiredFindings(requiredFindings) + finalFindings = append(finalFindings, newFinding) + + logger.Debug(). + Str("primary-rule", r.RuleID). + Int("primary-line", primaryFinding.StartLine). + Int("auxiliary-count", len(requiredFindings)). + Msg("multi-part rule satisfied") + } + } + + return finalFindings +} + +// hasAllRequiredRules checks if we have at least one auxiliary finding for each required rule +func (d *Detector) hasAllRequiredRules(auxiliaryFindings []*report.RequiredFinding, requiredRules []*config.Required) bool { + foundRules := make(map[string]bool) + // AuxiliaryFinding + for _, aux := range auxiliaryFindings { + foundRules[aux.RuleID] = true + } + + for _, required := range requiredRules { + if !foundRules[required.RuleID] { + return false + } + } + + return true +} + +func (d *Detector) withinProximity(primary, required report.Finding, requiredRule *config.Required) bool { + // fmt.Println(requiredRule.WithinLines) + // If neither within_lines nor within_columns is set, findings just need to be in the same fragment + if requiredRule.WithinLines == nil && requiredRule.WithinColumns == nil { + return true + } + + // Check line proximity (vertical distance) + if requiredRule.WithinLines != nil { + lineDiff := abs(primary.StartLine - required.StartLine) + if lineDiff > *requiredRule.WithinLines { + return false + } + } + + // Check column proximity (horizontal distance) + if requiredRule.WithinColumns != nil { + // Use the start column of each finding for proximity calculation + colDiff := abs(primary.StartColumn - required.StartColumn) + if colDiff > *requiredRule.WithinColumns { + return false + } + } + + return true +} + +// abs returns the absolute value of an integer +func abs(x int) int { + if x < 0 { + return -x + } + 
return x } // AddFinding synchronously adds a finding to the findings slice diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/detect/detect_test.go new/gitleaks-8.28.0/detect/detect_test.go --- old/gitleaks-8.27.2/detect/detect_test.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/detect/detect_test.go 2025-07-20 18:14:25.000000000 +0200 @@ -1,8 +1,10 @@ package detect import ( + "bytes" "context" "fmt" + "io" "os" "path/filepath" "runtime" @@ -10,6 +12,7 @@ "testing" "github.com/google/go-cmp/cmp" + "github.com/google/go-cmp/cmp/cmpopts" "github.com/rs/zerolog" "github.com/spf13/viper" "github.com/stretchr/testify/assert" @@ -86,6 +89,46 @@ secret%3D%25%35%61%25%34%37%25%35%36jb2RlZC1zZWNyZXQtdmFsdWU%25%32%35%25%33%33%25%36%34 ` +var multili = ` +username = "admin" + + + + password = "secret123" +` + +func compare(t *testing.T, a, b []report.Finding) { + if diff := cmp.Diff(a, b, + cmpopts.SortSlices(func(a, b report.Finding) bool { + if a.File != b.File { + return a.File < b.File + } + if a.StartLine != b.StartLine { + return a.StartLine < b.StartLine + } + if a.StartColumn != b.StartColumn { + return a.StartColumn < b.StartColumn + } + if a.EndLine != b.EndLine { + return a.EndLine < b.EndLine + } + if a.EndColumn != b.EndColumn { + return a.EndColumn < b.EndColumn + } + if a.RuleID != b.RuleID { + return a.RuleID < b.RuleID + } + return a.Secret < b.Secret + }), + cmpopts.IgnoreFields(report.Finding{}, + "Fingerprint", "Author", "Email", "Date", "Message", "Commit", "requiredFindings"), + cmpopts.EquateApprox(0.0001, 0), // For floating point Entropy comparison + ); diff != "" { + t.Errorf("findings mismatch (-want +got):\n%s", diff) + } + +} + func TestDetect(t *testing.T) { logging.Logger = logging.Logger.Level(zerolog.TraceLevel) tests := map[string]struct { @@ -97,8 +140,9 @@ // I.e., if the finding is from a --no-git file, the line number will be // increase by 1 in DetectFromFiles(). 
If the finding is from git, // the line number will be increased by the patch delta. - expectedFindings []report.Finding - wantError error + expectedFindings []report.Finding + wantError error + expectedAuxOutput string }{ // General "valid allow comment (1)": { @@ -424,6 +468,28 @@ FilePath: "tmp.go", }, }, + "fragment level composite": { + cfgName: "composite", + fragment: Fragment{ + Raw: multili, + }, + expectedFindings: []report.Finding{ + { + Description: "Primary rule", + RuleID: "primary-rule", + StartLine: 5, + EndLine: 5, + StartColumn: 5, + EndColumn: 26, + Line: "\n\t\t\tpassword = \"secret123\"", + Match: `password = "secret123"`, + Secret: "secret123", + Entropy: 2.9477028846740723, + Tags: []string{}, + }, + }, + expectedAuxOutput: "Required: username-rule:1:admin\n", + }, // Decoding "detect encoded": { cfgName: "encoded", @@ -736,11 +802,50 @@ d.baselinePath = tt.baselinePath findings := d.Detect(tt.fragment) - assert.ElementsMatch(t, tt.expectedFindings, findings) + + compare(t, findings, tt.expectedFindings) + + // extremely goofy way to test auxiliary findings + // capture stdout and print that sonabitch + // TODO + if tt.expectedAuxOutput != "" { + capturedOutput := captureStdout(func() { + for _, finding := range findings { + finding.PrintRequiredFindings() + } + }) + + // Clean up the output for comparison (remove ANSI color codes) + cleanOutput := stripANSI(capturedOutput) + expectedClean := stripANSI(tt.expectedAuxOutput) + + assert.Equal(t, expectedClean, cleanOutput, "Auxiliary output should match") + } + }) } } +func stripANSI(s string) string { + ansiRegex := regexp.MustCompile(`\x1b\[[0-9;]*m`) + return ansiRegex.ReplaceAllString(s, "") +} + +func captureStdout(f func()) string { + oldStdout := os.Stdout + r, w, _ := os.Pipe() + os.Stdout = w + + f() + + w.Close() + os.Stdout = oldStdout + + var buf bytes.Buffer + io.Copy(&buf, r) + return buf.String() +} + // TestFromGit tests the FromGit function func TestFromGit(t *testing.T) { // 
TODO: Fix this test on windows. @@ -2107,8 +2212,10 @@ }, expected: []report.Finding{ { - StartColumn: 50, - EndColumn: 60, + StartLine: 1, + EndLine: 1, + StartColumn: 18, + EndColumn: 28, Line: "let username = 'ja...@mail.com';\nlet password = 'Summer2024!';", Match: "Summer2024!", Secret: "Summer2024!", @@ -2132,8 +2239,10 @@ }, expected: []report.Finding{ { - StartColumn: 50, - EndColumn: 60, + StartLine: 1, + EndLine: 1, + StartColumn: 18, + EndColumn: 28, Line: "let username = 'ja...@mail.com';\nlet password = 'Summer2024!';", Match: "Summer2024!", Secret: "Summer2024!", @@ -2203,8 +2312,10 @@ }, expected: []report.Finding{ { - StartColumn: 50, - EndColumn: 60, + StartLine: 1, + EndLine: 1, + StartColumn: 18, + EndColumn: 28, Line: "let username = 'ja...@mail.com';\nlet password = 'Summer2024!';", Match: "Summer2024!", Secret: "Summer2024!", @@ -2225,8 +2336,10 @@ }, expected: []report.Finding{ { - StartColumn: 50, - EndColumn: 60, + StartLine: 1, + EndLine: 1, + StartColumn: 18, + EndColumn: 28, Line: "let username = 'ja...@mail.com';\nlet password = 'Summer2024!';", Match: "Summer2024!", Secret: "Summer2024!", @@ -2249,8 +2362,10 @@ }, expected: []report.Finding{ { - StartColumn: 50, - EndColumn: 60, + StartLine: 1, + EndLine: 1, + StartColumn: 18, + EndColumn: 28, Line: "let username = 'ja...@mail.com';\nlet password = 'Summer2024!';", Match: "Summer2024!", Secret: "Summer2024!", @@ -2290,10 +2405,9 @@ f := tc.fragment f.Raw = raw - actual := d.detectRule(f, [][]int{}, raw, rule, []*codec.EncodedSegment{}) - if diff := cmp.Diff(tc.expected, actual); diff != "" { - t.Errorf("diff: (-want +got)\n%s", diff) - } + + actual := d.detectRule(f, raw, rule, []*codec.EncodedSegment{}) + compare(t, tc.expected, actual) }) } } @@ -2451,10 +2565,8 @@ require.NoError(t, err) for name, test := range tests { t.Run(name, func(t *testing.T) { - actual := d.detectRule(test.fragment, [][]int{}, test.fragment.Raw, test.rule, []*codec.EncodedSegment{}) - if diff := 
cmp.Diff(test.expected, actual); diff != "" { - t.Errorf("diff: (-want +got)\n%s", diff) - } + actual := d.detectRule(test.fragment, test.fragment.Raw, test.rule, []*codec.EncodedSegment{}) + compare(t, test.expected, actual) }) } } @@ -2637,12 +2749,8 @@ require.NoError(t, err) for name, test := range tests { t.Run(name, func(t *testing.T) { - actual := d.detectRule(test.fragment, [][]int{}, test.fragment.Raw, test.rule, []*codec.EncodedSegment{}) - if diff := cmp.Diff(test.expected, actual); diff != "" { - t.Errorf("diff: (-want +got)\n%s", diff) - } + actual := d.detectRule(test.fragment, test.fragment.Raw, test.rule, []*codec.EncodedSegment{}) + compare(t, test.expected, actual) }) } } - -//endregion diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/detect/utils.go new/gitleaks-8.28.0/detect/utils.go --- old/gitleaks-8.27.2/detect/utils.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/detect/utils.go 2025-07-20 18:14:25.000000000 +0200 @@ -232,7 +232,9 @@ fmt.Printf("%-12s %s\n", "RuleID:", f.RuleID) fmt.Printf("%-12s %f\n", "Entropy:", f.Entropy) + if f.File == "" { + f.PrintRequiredFindings() fmt.Println("") return } @@ -243,6 +245,7 @@ fmt.Printf("%-12s %d\n", "Line:", f.StartLine) if f.Commit == "" { fmt.Printf("%-12s %s\n", "Fingerprint:", f.Fingerprint) + f.PrintRequiredFindings() fmt.Println("") return } @@ -254,5 +257,7 @@ if f.Link != "" { fmt.Printf("%-12s %s\n", "Link:", f.Link) } + + f.PrintRequiredFindings() fmt.Println("") } diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/go.mod new/gitleaks-8.28.0/go.mod --- old/gitleaks-8.27.2/go.mod 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/go.mod 2025-07-20 18:14:25.000000000 +0200 @@ -1,8 +1,6 @@ module github.com/zricethezav/gitleaks/v8 -go 1.23.0 - -toolchain go1.23.4 +go 1.23.8 require ( github.com/BobuSumisu/aho-corasick v1.0.3 @@ -62,7 +60,7 @@ 
github.com/wasilibs/wazero-helpers v0.0.0-20240620070341-3dff1577cd52 // indirect go.uber.org/multierr v1.11.0 // indirect go4.org v0.0.0-20230225012048-214862532bf5 // indirect - golang.org/x/crypto v0.32.0 // indirect + golang.org/x/crypto v0.35.0 // indirect ) require ( diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/go.sum new/gitleaks-8.28.0/go.sum --- old/gitleaks-8.27.2/go.sum 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/go.sum 2025-07-20 18:14:25.000000000 +0200 @@ -245,8 +245,8 @@ golang.org/x/crypto v0.0.0-20190605123033-f99c8df09eb5/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI= golang.org/x/crypto v0.0.0-20191011191535-87dc89f01550/go.mod h1:yigFU9vqHzYiE8UmvKecakEJjdnWj3jj499lnFckfCI= golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc= -golang.org/x/crypto v0.32.0 h1:euUpcYgM8WcP71gNpTqQCn6rC2t6ULUPiOzfWaXVVfc= -golang.org/x/crypto v0.32.0/go.mod h1:ZnnJkOaASj8g0AjIduWNlq2NRxL0PlBrbKVyZ6V/Ugc= +golang.org/x/crypto v0.35.0 h1:b15kiHdrGCHrP6LvwaQ3c03kgNhhiMgvlhxHQhmg2Xs= +golang.org/x/crypto v0.35.0/go.mod h1:dy7dXNW32cAb/6/PRuTNsix8T+vJAqvuIy5Bli/x0YQ= golang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA= golang.org/x/exp v0.0.0-20190306152737-a1d7652674e8/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA= golang.org/x/exp v0.0.0-20190510132918-efd6b22b2522/go.mod h1:ZjyILWgesfNpC6sMxTJOJm9Kp84zZh5NQWvqDGG3Qr8= diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/report/finding.go new/gitleaks-8.28.0/report/finding.go --- old/gitleaks-8.27.2/report/finding.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/report/finding.go 2025-07-20 18:14:25.000000000 +0200 @@ -1,12 +1,16 @@ package report import ( + "fmt" "math" "strings" + + "github.com/charmbracelet/lipgloss" + 
"github.com/zricethezav/gitleaks/v8/sources" ) -// Finding contains information about strings that -// have been captured by a tree-sitter query. +// Finding contains a whole bunch of information about a secret finding. +// Plenty of real estate in this bad boy so fillerup as needed. type Finding struct { // Rule is the name of the rule that was matched RuleID string @@ -21,8 +25,7 @@ Match string - // Secret contains the full content of what is matched in - // the tree-sitter query. + // Captured secret Secret string // File is the name of the file containing the finding @@ -42,6 +45,33 @@ // unique identifier Fingerprint string + + // Fragment used for multi-part rule checking, CEL filtering, + // and eventually ML validation + Fragment *sources.Fragment `json:",omitempty"` + + // TODO keeping private for now to during experimental phase + requiredFindings []*RequiredFinding +} + +type RequiredFinding struct { + // contains a subset of the Finding fields + // only used for reporting + RuleID string + StartLine int + EndLine int + StartColumn int + EndColumn int + Line string `json:"-"` + Match string + Secret string +} + +func (f *Finding) AddRequiredFindings(afs []*RequiredFinding) { + if f.requiredFindings == nil { + f.requiredFindings = make([]*RequiredFinding, 0) + } + f.requiredFindings = append(f.requiredFindings, afs...) } // Redact removes sensitive information from a finding. @@ -68,3 +98,29 @@ return secret[:lth] + "..." } + +func (f *Finding) PrintRequiredFindings() { + if len(f.requiredFindings) == 0 { + return + } + + fmt.Printf("%-12s ", "Required:") + + // Create orange style for secrets + orangeStyle := lipgloss.NewStyle().Foreground(lipgloss.Color("#bf9478")) + + for i, aux := range f.requiredFindings { + auxSecret := strings.TrimSpace(aux.Secret) + // Truncate long secrets for readability + if len(auxSecret) > 40 { + auxSecret = auxSecret[:37] + "..." 
+ } + + // Format: rule-id:line:secret + if i == 0 { + fmt.Printf("%s:%d:%s\n", aux.RuleID, aux.StartLine, orangeStyle.Render(auxSecret)) + } else { + fmt.Printf("%-12s %s:%d:%s\n", "", aux.RuleID, aux.StartLine, orangeStyle.Render(auxSecret)) + } + } +} diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/sources/fragment.go new/gitleaks-8.28.0/sources/fragment.go --- old/gitleaks-8.27.2/sources/fragment.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/sources/fragment.go 2025-07-20 18:14:25.000000000 +0200 @@ -23,4 +23,6 @@ // CommitInfo captures additional information about the git commit if applicable CommitInfo *CommitInfo + + InheritedFromFinding bool // Indicates if this fragment is inherited from a finding } diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/sources/git.go new/gitleaks-8.28.0/sources/git.go --- old/gitleaks-8.27.2/sources/git.go 2025-06-09 02:31:25.000000000 +0200 +++ new/gitleaks-8.28.0/sources/git.go 2025-07-20 18:14:25.000000000 +0200 @@ -440,7 +440,7 @@ } } -var sshUrlpat = regexp.MustCompile(`^git@([a-zA-Z0-9.-]+):([\w/.-]+?)(?:\.git)?$`) +var sshUrlpat = regexp.MustCompile(`^git@([a-zA-Z0-9.-]+):(?:\d{1,5}/)?([\w/.-]+?)(?:\.git)?$`) func getRemoteUrl(source string) (*url.URL, error) { // This will return the first remote — typically, "origin". 
diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/gitleaks-8.27.2/testdata/config/composite.toml new/gitleaks-8.28.0/testdata/config/composite.toml --- old/gitleaks-8.27.2/testdata/config/composite.toml 1970-01-01 01:00:00.000000000 +0100 +++ new/gitleaks-8.28.0/testdata/config/composite.toml 2025-07-20 18:14:25.000000000 +0200 @@ -0,0 +1,14 @@ +title = "Fragment level composite rule" + +[[rules]] +id = "primary-rule" +description = "Primary rule" +regex = 'password\s*=\s*"([^"]+)"' +[[rules.required]] +id = "username-rule" + +[[rules]] +id = "username-rule" +description = "Username rule" +regex = 'username\s*=\s*"([^"]+)"' +skipReport = true \ No newline at end of file ++++++ gitleaks.obsinfo ++++++ --- /var/tmp/diff_new_pack.7wiTow/_old 2025-07-24 18:54:38.620406385 +0200 +++ /var/tmp/diff_new_pack.7wiTow/_new 2025-07-24 18:54:38.624406550 +0200 @@ -1,5 +1,5 @@ name: gitleaks -version: 8.27.2 -mtime: 1749429085 -commit: c7acf33d962e8effc070072f993c365af19e3661 +version: 8.28.0 +mtime: 1753028065 +commit: 4fb43823ef3d152d239e92d7d5cb04783b548062 ++++++ vendor.tar.gz ++++++ diff -urN '--exclude=CVS' '--exclude=.cvsignore' '--exclude=.svn' '--exclude=.svnignore' old/vendor/modules.txt new/vendor/modules.txt --- old/vendor/modules.txt 2025-06-09 02:31:25.000000000 +0200 +++ new/vendor/modules.txt 2025-07-20 18:14:25.000000000 +0200 @@ -319,8 +319,8 @@ ## explicit; go 1.13 go4.org/readerutil go4.org/syncutil -# golang.org/x/crypto v0.32.0 -## explicit; go 1.20 +# golang.org/x/crypto v0.35.0 +## explicit; go 1.23.0 golang.org/x/crypto/bcrypt golang.org/x/crypto/blowfish golang.org/x/crypto/pbkdf2
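The new testdata/config/composite.toml above exercises only fragment-level matching (no proximity constraints). Per the release notes, proximity can additionally be constrained with `withinLines` and/or `withinColumns` on the `[[rules.required]]` table; a hedged sketch, with illustrative rule ids and limits:

```toml
# Sketch of a composite rule with a proximity constraint, following the
# v8.28.0 release notes. The regexes and the withinLines value are
# illustrative, not taken from the shipped testdata.
[[rules]]
id = "primary-rule"
description = "Password assignment (primary)"
regex = 'password\s*=\s*"([^"]+)"'

# Only report primary-rule when username-rule also matches within
# 5 lines of the password match in the same fragment.
[[rules.required]]
id = "username-rule"
withinLines = 5

[[rules]]
id = "username-rule"
description = "Username assignment (auxiliary)"
regex = 'username\s*=\s*"([^"]+)"'
skipReport = true
```

With this config, the `multili` test fixture above (username on line 1, password on line 5) would still produce a finding, while a password more than 5 lines from any username match would be suppressed.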