dependabot[bot] opened a new pull request, #1747:
URL: https://github.com/apache/stormcrawler/pull/1747

   Bumps 
[com.github.crawler-commons:crawler-commons](https://github.com/crawler-commons/crawler-commons)
 from 1.5 to 1.6.
   <details>
   <summary>Release notes</summary>
   <p><em>Sourced from <a 
href="https://github.com/crawler-commons/crawler-commons/releases";>com.github.crawler-commons:crawler-commons's
 releases</a>.</em></p>
   <blockquote>
   <h2>crawler-commons-1.6</h2>
   <h2>Important Changes</h2>
   <ul>
   <li>This release adds support for IDNA2008 domain names and public suffixes 
in EffectiveTldFinder. If you rely on a recent version of the public suffix 
list, please upgrade to release 1.6! See issue <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/551";>#551</a>
 for more information.</li>
   </ul>
   <h2>Full List of Changes</h2>
   <ul>
   <li>Support IDNA2008 Unicode domains by using ALLOW_UNASSIGNED in IDN 
methods within TldFinder / Normalizer. (Richard Zowalla, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/551";>#551</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/552";>#552</a></li>
   <li>Add URLUtils class for URL resolution functionality (HamzaElzarw-2022, 
Richard Zowalla, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/526";>#526</a></li>
   <li>Replace deprecated URL constructor in BasicURLNormalizer 
(HamzaElzarw-2022, Richard Zowalla, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/531";>#531</a></li>
   <li>Add matchedWildcard flag to BaseRobotRules (CoGiang, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/530";>#530</a></li>
   <li>[Domains] Update unit tests after change in public suffix list 
(sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/544";>#544</a></li>
   <li>[Domains] Replace deleted *.uberspace.de with a wildcard from Google 
(Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/532";>#532</a></li>
   <li>Partial replacement of deprecated URL constructors (HamzaElzarw-2022, 
kkrugler, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/522";>#522</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/524";>#524</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/545";>#545</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/536";>#536</a></li>
   <li>Upgrade dependencies (dependabot) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/521";>#521</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/533";>#533</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/534";>#534</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/547";>#547</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/549";>#549</a>,</li>
   <li>Upgrade Maven plugins (dependabot) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/520";>#520</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/538";>#538</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/539";>#539</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/540";>#540</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/546";>#546</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/550";>#550</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/553";>#553</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/554";>#554</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/555";>#555</a></li>
   </ul>
   <h2>New Contributors</h2>
   <ul>
   <li><a 
href="https://github.com/HamzaElzarw-2022";><code>@​HamzaElzarw-2022</code></a> 
made their first contributions in <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/524";>#524</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/526";>#526</a>
 and <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/531";>#531</a></li>
   <li><a href="https://github.com/CoGiang";><code>@​CoGiang</code></a> made 
their first contribution in <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/530";>#530</a></li>
   </ul>
   </blockquote>
   </details>
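   For context on the IDNA2008 item in the release notes above: a minimal, JDK-only sketch of the `java.net.IDN` flag those notes refer to. The host name is made up purely for illustration, and crawler-commons' own `EffectiveTldFinder` API is deliberately not shown here.

```java
import java.net.IDN;

public class IdnFlagSketch {
    public static void main(String[] args) {
        String unicodeHost = "bücher.example"; // hypothetical host, illustration only

        // Default conversion follows the older IDNA2003 tables and rejects
        // code points that those tables leave unassigned.
        System.out.println(IDN.toASCII(unicodeHost));

        // ALLOW_UNASSIGNED lets such code points pass through instead of
        // throwing; per the release notes above, the 1.6 TldFinder/Normalizer
        // code now passes this flag so newer (IDNA2008-era) names are accepted.
        System.out.println(IDN.toASCII(unicodeHost, IDN.ALLOW_UNASSIGNED));
    }
}
```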
   <details>
   <summary>Changelog</summary>
   <p><em>Sourced from <a 
href="https://github.com/crawler-commons/crawler-commons/blob/master/CHANGES.txt";>com.github.crawler-commons:crawler-commons's
 changelog</a>.</em></p>
   <blockquote>
   <p>Crawler-Commons Change Log</p>
   <p>Current Development 1.7-SNAPSHOT (yyyy-mm-dd)</p>
   <p>Release 1.6 (2025-12-04)</p>
   <ul>
   <li>Support IDNA2008 Unicode domains by using ALLOW_UNASSIGNED in IDN 
methods within TldFinder / Normalizer. (Richard Zowalla, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/551";>#551</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/552";>#552</a></li>
   <li>Add URLUtils class for URL resolution functionality (HamzaElzarw-2022, 
Richard Zowalla, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/526";>#526</a></li>
   <li>Replace deprecated URL constructor in BasicURLNormalizer 
(HamzaElzarw-2022, Richard Zowalla, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/531";>#531</a></li>
   <li>Add matchedWildcard flag to BaseRobotRules (CoGiang, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/530";>#530</a></li>
   <li>[Domains] Update unit tests after change in public suffix list 
(sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/544";>#544</a></li>
   <li>[Domains] Replace deleted *.uberspace.de with a wildcard from Google 
(Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/532";>#532</a></li>
   <li>Partial replacement of deprecated URL constructors (HamzaElzarw-2022, 
kkrugler, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/522";>#522</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/524";>#524</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/545";>#545</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/536";>#536</a></li>
   <li>Upgrade dependencies (dependabot) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/521";>#521</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/533";>#533</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/534";>#534</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/547";>#547</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/549";>#549</a>,</li>
   <li>Upgrade Maven plugins (dependabot) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/520";>#520</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/538";>#538</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/539";>#539</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/540";>#540</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/546";>#546</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/550";>#550</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/553";>#553</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/554";>#554</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/555";>#555</a></li>
   </ul>
   <p>Release 1.5 (2025-06-27)</p>
   <ul>
   <li>Migrate publishing from OSSRH to Central Portal (jnioche, 
sebastian-nagel, Richard Zowalla, aecio) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/510";>#510</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/516";>#516</a></li>
   <li>[Sitemaps] Add cross-submit feature (Avi Hayun, kkrugler, 
sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/85";>#85</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/515";>#515</a></li>
   <li>[Sitemaps] Complete sitemap extension attributes (sebastian-nagel, 
Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/513";>#513</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/514";>#514</a></li>
   <li>[Sitemaps] Allow partial extension metadata (adriabonetmrf, 
sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/456";>#456</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/458";>#458</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/512";>#512</a></li>
   <li>[Domains] EffectiveTldFinder to also take shorter suffix matches into 
account (sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/479";>#479</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/505";>#505</a></li>
   <li>Add package-info.java to all packages (sebastian-nagel, Richard Zowalla) 
<a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/432";>#432</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/504";>#504</a></li>
   <li>[Robots.txt] Extend API to allow checking java.net.URL objects 
(sebastian-nagel, aecio, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/502";>#502</a></li>
   <li>[Robots.txt] Incorrect robots.txt result for uppercase user agents 
(teammakdi, sebastian-nagel, aecio, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/453";>#453</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/500";>#500</a></li>
   <li>Remove class utils.Strings (sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/503";>#503</a></li>
   <li>[BasicNormalizer] Complete normalization feature list of 
BasicURLNormalizer (sebastian-nagel, kkrugler) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/494";>#494</a></li>
   <li>[Robots] Document that URLs not properly normalized may not be matched 
by robots.txt parser (sebastian-nagel, kkrugler) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/492";>#492</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/493";>#493</a></li>
   <li>[Sitemaps] Added https variants of namespaces (jnioche) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/487";>#487</a></li>
   <li>[Domains] Add version of public suffix list shipped with release 
packages (sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/433";>#433</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/484";>#484</a></li>
   <li>[Domains] Improve representation of public suffix match results by class 
EffectiveTLD (sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/478";>#478</a></li>
   <li>Javadoc: fix links to Java core classes (sebastian-nagel, Richard 
Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/417";>#417</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/483";>#483</a></li>
   <li>[Sitemaps] Improve logging done by SiteMapParser (Valery Yatsynovich, 
sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/457";>#457</a></li>
   <li>[Sitemaps] Google Sitemap PageMap extensions (josepowera, 
sebastian-nagel, Richard Zowalla, jnioche) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/388";>#388</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/442";>#442</a></li>
   <li>[Domains] Installation of a gzip-compressed public suffix list from 
Maven cache breaks EffectiveTldFinder (sebastian-nagel, Richard 
Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/441";>#441</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/443";>#443</a></li>
   <li>Upgrade dependencies (dependabot) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/437";>#437</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/444";>#444</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/448";>#448</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/451";>#451</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/473";>#473</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/465";>#465</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/466";>#466</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/468";>#468</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/488";>#488</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/491";>#491</a>,
 <a href="https://redirect.github.com/crawler-co
 mmons/crawler-commons/issues/506">#506</a>, <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/511";>#511</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/517";>#517</a></li>
   <li>Upgrade Maven plugins (dependabot) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/434";>#434</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/438";>#438</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/439";>#439</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/449";>#449</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/445";>#445</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/452";>#452</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/455";>#455</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/459";>#459</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/460";>#460</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/464";>#464</a>,
 <a href="https://redirect.github.com/crawler-c
 ommons/crawler-commons/issues/469">#469</a>, <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/467";>#467</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/470";>#470</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/471";>#471</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/472";>#472</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/474";>#474</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/475";>#475</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/476";>#476</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/477";>#477</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/480";>#480</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/481";>#481</a>,
 <a href="https://redirect.github.com/crawl
 er-commons/crawler-commons/issues/482">#482</a>, <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/489";>#489</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/490";>#490</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/495";>#495</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/496";>#496</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/497";>#497</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/498";>#498</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/499";>#499</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/508";>#508</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/509";>#509</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/518";>#518</a></li>
   <li>Upgrade GitHub workflow actions v2 -&gt; v4 (sebastian-nagel, Richard 
Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/501";>#501</a></li>
   </ul>
   <p>Release 1.4 (2023-07-13)</p>
   <ul>
   <li>[Robots.txt] Implement Robots Exclusion Protocol (REP) IETF Draft: port 
unit tests (sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/245";>#245</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/360";>#360</a></li>
   <li>[Robots.txt] Close groups of rules as defined in RFC 9309 (kkrugler, 
garyillyes, jnioche, sebastian-nagel) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/114";>#114</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/390";>#390</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/430";>#430</a></li>
   <li>[Robots.txt] Empty disallow statement not to clear other rules 
(sebastian-nagel, jnioche) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/422";>#422</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/424";>#424</a></li>
   <li>[Robots.txt] SimpleRobotRulesParser main() to follow five redirects 
(sebastian-nagel, jnioche) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/428";>#428</a></li>
   <li>[Robots.txt] Add more spelling variants and typos of robots.txt 
directives (sebastian-nagel, jnioche) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/425";>#425</a></li>
   <li>[Robots.txt] Document effect of rules merging in combination with 
multiple agent names (sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/423";>#423</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/426";>#426</a></li>
   <li>[Robots.txt] Pass empty collection of agent names to select rules for 
any robot (wildcard user-agent name) (sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/427";>#427</a></li>
   <li>[Robots.txt] Rename default user-agent / robot name in unit tests 
(sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/429";>#429</a></li>
   <li>[Robots.txt] Add unit tests based on examples in RFC 9309 
(sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/420";>#420</a></li>
   <li>[BasicNormalizer] Query parameters normalization in BasicURLNormalizer 
(aecio, sebastian-nagel, Richard Zowalla) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/308";>#308</a>,
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/421";>#421</a></li>
   <li>[Robots.txt] Deduplicate robots rules before matching (sebastian-nagel, 
jnioche) <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/416";>#416</a></li>
   </ul>
   <!-- raw HTML omitted -->
   </blockquote>
   <p>... (truncated)</p>
   </details>
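   Several changelog entries above (the `BasicURLNormalizer` and `URLUtils` items, and the partial replacements in #522, #524, #536, #545) concern moving off the deprecated `java.net.URL` constructors. Below is a small, JDK-only sketch of that general migration pattern, using made-up example values; the exact `URLUtils.resolve(...)` signature is not quoted in this PR, so it is not shown here.

```java
import java.net.MalformedURLException;
import java.net.URI;
import java.net.URISyntaxException;
import java.net.URL;

public class UrlConstructorSketch {
    public static void main(String[] args) throws MalformedURLException, URISyntaxException {
        String base = "https://www.example.com/a/b"; // illustrative values only
        String relative = "../c?d=e";

        // Old style: URL constructors, deprecated since JDK 20.
        URL resolvedOld = new URL(new URL(base), relative);

        // Replacement pattern: resolve via java.net.URI, then convert to URL.
        URL resolvedNew = new URI(base).resolve(relative).toURL();

        System.out.println(resolvedOld);
        System.out.println(resolvedNew);
    }
}
```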
   <details>
   <summary>Commits</summary>
   <ul>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/ce0fcb3e26dd653af93434a4cd64b1be4f7bce01";><code>ce0fcb3</code></a>
 [maven-release-plugin] prepare release crawler-commons-1.6</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/3d24b45253fdb8c2f1511b099af71d81d6666f2a";><code>3d24b45</code></a>
 Update CHANGES.txt for release of 1.6</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/9c822514b6c962c19a78f71ba3868cc5d5ab7352";><code>9c82251</code></a>
 Bump org.apache.maven.plugins:maven-source-plugin from 3.3.1 to 3.4.0</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/59cb351f3e37d03ca27ff0aa68eccd795196a746";><code>59cb351</code></a>
 Bump de.thetaphi:forbiddenapis from 3.9 to 3.10</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/4730575f7e907c1cda915c4cca209507ff90e13b";><code>4730575</code></a>
 Bump org.apache.maven.plugins:maven-jar-plugin from 3.4.2 to 3.5.0</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/5164d0629a604f76b6592408fb2a0ad7da209afc";><code>5164d06</code></a>
 Update changelog</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/e58ecbcf8892f579eaca640b7cd97905756535b4";><code>e58ecbc</code></a>
 <a 
href="https://redirect.github.com/crawler-commons/crawler-commons/issues/551";>#551</a>
 – Support IDNA2008 Unicode domains by using ALLOW_UNASSIGNED in IDN 
meth...</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/23aaeb29e9a354d747ac4ad9f9b6b3aa52db0856";><code>23aaeb2</code></a>
 Update Changelog</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/8eb9c8fc8963ea0aa9a13318fac6d4e874180b45";><code>8eb9c8f</code></a>
 Unit tests for URLUtils.resolve(...)</li>
   <li><a 
href="https://github.com/crawler-commons/crawler-commons/commit/94658e34f78cf31be1cefbf55b5a801e818b1e58";><code>94658e3</code></a>
 Update Changelog</li>
   <li>Additional commits viewable in <a 
href="https://github.com/crawler-commons/crawler-commons/compare/crawler-commons-1.5...crawler-commons-1.6";>compare
 view</a></li>
   </ul>
   </details>
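   The robots.txt related entries above (the new `matchedWildcard` flag on `BaseRobotRules` in 1.6, the RFC 9309 work in 1.4) all surface through `SimpleRobotRulesParser`. A hedged usage sketch follows, assuming the collection-of-agent-names `parseContent` overload referenced in the 1.4 notes; the accessor for the new `matchedWildcard` flag is not quoted in this PR, so it is left out.

```java
import java.nio.charset.StandardCharsets;
import java.util.List;

import crawlercommons.robots.BaseRobotRules;
import crawlercommons.robots.SimpleRobotRulesParser;

public class RobotsRulesSketch {
    public static void main(String[] args) {
        // Made-up robots.txt content, for illustration only.
        byte[] content = String.join("\n",
                "User-agent: *",
                "Disallow: /private/").getBytes(StandardCharsets.UTF_8);

        SimpleRobotRulesParser parser = new SimpleRobotRulesParser();

        // Per the 1.4 notes above, an empty collection of agent names selects
        // the wildcard group; here a single (hypothetical) agent name is used.
        BaseRobotRules rules = parser.parseContent(
                "https://www.example.com/robots.txt",
                content,
                "text/plain",
                List.of("mycrawler"));

        System.out.println(rules.isAllowed("https://www.example.com/index.html")); // expected: true
        System.out.println(rules.isAllowed("https://www.example.com/private/x"));  // expected: false
    }
}
```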
   <br />
   
   
   [![Dependabot compatibility 
score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=com.github.crawler-commons:crawler-commons&package-manager=maven&previous-version=1.5&new-version=1.6)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
   
   Dependabot will resolve any conflicts with this PR as long as you don't 
alter it yourself. You can also trigger a rebase manually by commenting 
`@dependabot rebase`.
   
   [//]: # (dependabot-automerge-start)
   [//]: # (dependabot-automerge-end)
   
   ---
   
   <details>
   <summary>Dependabot commands and options</summary>
   <br />
   
   You can trigger Dependabot actions by commenting on this PR:
   - `@dependabot rebase` will rebase this PR
   - `@dependabot recreate` will recreate this PR, overwriting any edits that 
have been made to it
   - `@dependabot merge` will merge this PR after your CI passes on it
   - `@dependabot squash and merge` will squash and merge this PR after your CI 
passes on it
   - `@dependabot cancel merge` will cancel a previously requested merge and 
block automerging
   - `@dependabot reopen` will reopen this PR if it is closed
   - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually
   - `@dependabot show <dependency name> ignore conditions` will show all of 
the ignore conditions of the specified dependency
   - `@dependabot ignore this major version` will close this PR and stop 
Dependabot creating any more for this major version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this minor version` will close this PR and stop 
Dependabot creating any more for this minor version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this dependency` will close this PR and stop 
Dependabot creating any more for this dependency (unless you reopen the PR or 
upgrade to it yourself)
   
   
   </details>


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
