
Add support for an Index alongside the Catalog #332

Merged: chennes merged 1 commit into FreeCAD:dev from chennes:supportIndexAndCatalog, Feb 8, 2026


Conversation

chennes (Member) commented Feb 8, 2026:

Also implement sparse checkout of the huge repos, and change the data source location for the cache.

Copilot AI review requested due to automatic review settings February 8, 2026 18:45
Copilot AI (Contributor) left a comment


Pull request overview

This PR updates the addon cache generation flow to consume the new Addons Index.json, generate both an index cache and a curated-only catalog cache for backward compatibility, and adds sparse git cloning for very large repos.

Changes:

  • Switch cache source from the legacy AddonCatalog.json URL to the Addons Data/Index.json.
  • Generate a full addon_index_cache.zip plus a curated-only addon_catalog_cache.zip for older Addon Manager versions.
  • Add sparse-clone support and clone timeout/error reporting for large repositories.

Reviewed changes

Copilot reviewed 5 out of 5 changed files in this pull request and generated 9 comments.

Files reviewed:

  • AddonCatalogCacheCreator.py: Fetch from Index.json, write the new index + catalog cache artifacts, add sparse clone + timeout/error tracking.
  • AddonCatalog.py: Add curated / sparse_cache fields and use the curated flag on the instantiated Addon.
  • Addon.py: Add an Addon.curated property for UI/logic to consume.
  • AddonCatalog.schema.json: Update the schema to include the new curated boolean field.
  • AddonManagerTest/app/test_addon_catalog_cache_creator.py: Adjust tests for the updated cache writer behavior (currently inconsistent with the implementation).


```diff
 ]
-completed_process = subprocess.run(command)
+try:
+    completed_process = subprocess.run(command, CLONE_TIMEOUT)
```

Copilot AI commented Feb 8, 2026:

In clone_or_update, subprocess.run(command, CLONE_TIMEOUT) is passing CLONE_TIMEOUT as a positional arg (it becomes bufsize), so the clone will not time out. Use timeout=CLONE_TIMEOUT (and consider check=True to raise on non-zero exit) so oversized repos can be handled as intended.

Suggested change:

```diff
-completed_process = subprocess.run(command, CLONE_TIMEOUT)
+completed_process = subprocess.run(command, timeout=CLONE_TIMEOUT)
```
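The fix can be sketched end to end. `CLONE_TIMEOUT`'s value and the wrapper function name below are assumptions for illustration, not the PR's actual code:

```python
import subprocess

CLONE_TIMEOUT = 300  # seconds; hypothetical value for illustration


def run_clone(command: list) -> subprocess.CompletedProcess:
    """Run a git command with a real timeout and a non-zero-exit check.

    Passing CLONE_TIMEOUT positionally binds it to `bufsize`, so it must be
    given as the `timeout` keyword; `check=True` raises CalledProcessError
    on failure instead of silently continuing.
    """
    return subprocess.run(command, timeout=CLONE_TIMEOUT, check=True)
```

With this shape, `subprocess.TimeoutExpired` is raised when the clone exceeds the limit, which the caller can catch and report as the review suggests.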

Comment on lines 138 to 145:

```python
# Write the entire index for versions of the Addon Manager after 2026-01-24
with zipfile.ZipFile(
    os.path.join(self.cwd, "addon_index_cache.zip"), "w", zipfile.ZIP_DEFLATED
) as zipf:
    zipf.writestr(
        "addon_index_cache.json",
        json.dumps(recursive_serialize(self.catalog.get_catalog()), indent=" "),
    )
```

Copilot AI commented Feb 8, 2026:

This adds a new addon_index_cache.zip artifact, but unlike the other caches it does not generate a corresponding .sha256. addonmanager_workers_startup.new_cache_available() expects a <cache_name>_cache_url + '.sha256' endpoint, so the index cache may not be refreshable/detectable consistently. Consider writing addon_index_cache.zip.sha256 as well (and updating the final log message accordingly).
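A minimal sketch of the suggested sidecar, assuming the artifact name from the PR; the helper function itself is hypothetical:

```python
import hashlib
from pathlib import Path


def write_sha256_sidecar(artifact: Path) -> Path:
    """Write `<artifact>.sha256` next to the artifact so a hash comparison
    (as new_cache_available() expects) can detect when the cache changed."""
    digest = hashlib.sha256(artifact.read_bytes()).hexdigest()
    sidecar = artifact.with_name(artifact.name + ".sha256")
    sidecar.write_text(digest + "\n", encoding="utf-8")
    return sidecar
```

Calling `write_sha256_sidecar(Path("addon_index_cache.zip"))` would then produce `addon_index_cache.zip.sha256`, matching the `<cache_name>_cache_url + '.sha256'` convention the comment describes.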

Comment on lines 270 to 273:

```diff
 writer = accc.CacheWriter()
-writer.catalog = MagicMock()
+writer.index = MagicMock()
 writer.cwd = os.path.abspath(os.path.join("home", "cache"))
 writer.create_local_copy_of_single_addon("TestMod", catalog_entries)
```

Copilot AI commented Feb 8, 2026:

These tests now set writer.index = MagicMock(), but CacheWriter uses self.catalog (e.g., self.catalog.add_metadata_to_entry(...) in create_local_copy_of_single_addon). With catalog left as None, the test will raise AttributeError. Update the test to set writer.catalog (or mock the add_metadata_to_entry call path) instead of writer.index.
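A sketch of the fix the comment suggests, with stand-ins for the real classes; `CacheWriter`'s attribute and method names follow the review text, everything else is illustrative:

```python
from unittest.mock import MagicMock


class FakeCacheWriter:
    """Stand-in for accc.CacheWriter, reduced to the attribute the review
    says the production code path actually reads."""

    def __init__(self):
        self.catalog = None  # left as None, using it raises AttributeError

    def create_local_copy_of_single_addon(self, addon_id, entries):
        # Mirrors the call path the review cites: the writer goes through
        # self.catalog, not a separate `index` attribute.
        self.catalog.add_metadata_to_entry(addon_id, entries)


writer = FakeCacheWriter()
writer.catalog = MagicMock()  # the fix: mock `catalog`, not `index`
writer.create_local_copy_of_single_addon("TestMod", [])
```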

Comment on lines 290 to 293:

```diff
 writer = accc.CacheWriter()
-writer.catalog = MagicMock()
+writer.index = MagicMock()
 writer.cwd = os.path.abspath(os.path.join("home", "cache"))
 writer.create_local_copy_of_single_addon("TestMod", catalog_entries)
```

Copilot AI commented Feb 8, 2026:

Same issue as above: writer.index is not used by CacheWriter, but create_local_copy_of_single_addon() relies on writer.catalog being set to something that has add_metadata_to_entry. Setting writer.index here will cause failures when the test hits the metadata/zip steps.

Comment on lines 316 to 324:

```python
mock_fetch_catalog.return_value = MockCatalog()
writer = accc.CacheWriter()
writer.index = MockCatalog()
writer.create_local_copy_of_addons()
```

Copilot AI commented Feb 8, 2026:

create_local_copy_of_addons() iterates self.catalog.get_catalog(), but this test sets writer.index and never sets writer.catalog, so it will fail with AttributeError. Also, the test still references accc.EXCLUDED_REPOS, which no longer exists (and the production code no longer skips those repos). Update the test data/expectations to match the new behavior (e.g., set writer.catalog = MockCatalog() and adjust the skipped-repo assertion).

Comment on lines +253 to +255:

```python
def test_generate_cache_entry_with_approval(self):
    """If the addon appears in the catalog (as opposed to just the index), it gets marked as approved."""
```

Copilot AI commented Feb 8, 2026:

test_generate_cache_entry_with_approval currently has no assertions and will always pass, so it doesn't actually validate the new curated/approved behavior described in the docstring. Either implement the test logic (assert curated filtering/flagging) or remove the placeholder to avoid giving a false sense of coverage.
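One way the placeholder could gain real assertions, sketched against stand-in objects. Only the `curated` flag name comes from the PR; `FakeAddon`, the helper, and its signature are hypothetical:

```python
class FakeAddon:
    """Minimal stand-in for Addon carrying the new `curated` property."""

    def __init__(self, curated: bool):
        self.curated = curated


def generate_cache_entry(addon_id: str, catalog_ids: set) -> FakeAddon:
    # Stand-in for the real generation step: an addon that appears in the
    # curated catalog (not just the index) is flagged as curated/approved.
    return FakeAddon(curated=addon_id in catalog_ids)
```

A real test would then assert both directions, e.g. that an addon present in the catalog comes back with `curated` set and an index-only addon does not.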

Comment on lines 121 to 137:

```python
def write(self, addon_id: Optional[str] = None) -> None:
    original_working_directory = os.getcwd()
    os.makedirs(self.cwd, exist_ok=True)
    os.chdir(self.cwd)
    self.create_local_copy_of_addons()

    fetcher = CatalogFetcher()
    self.catalog = fetcher.catalog

    if addon_id is None:
        self.create_local_copy_of_addons()
    else:
        catalog = self.catalog.get_catalog()
        if addon_id not in catalog:
            raise RuntimeError(f"ERROR: Addon {addon_id} not in index")
        catalog_entries = catalog[addon_id]
        self.create_local_copy_of_single_addon(addon_id, catalog_entries)
```

Copilot AI commented Feb 8, 2026:

write() changes the process working directory with os.chdir(self.cwd) but does not restore it if an exception occurs (e.g., addon_id not in index, fetch failure, zip write error). Wrap the body in try/finally and always os.chdir(original_working_directory) to avoid leaking state to callers/tests.
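A sketch of the suggested try/finally shape. The names follow the snippet above, but the method body is condensed to a callable for brevity:

```python
import os


def write_in_dir(cwd: str, body) -> None:
    """Run `body()` inside `cwd`, always restoring the original working
    directory even if the body raises (fetch failure, missing addon, ...)."""
    original_working_directory = os.getcwd()
    os.makedirs(cwd, exist_ok=True)
    os.chdir(cwd)
    try:
        body()  # stands in for the clone/fetch/zip steps of write()
    finally:
        os.chdir(original_working_directory)
```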

Comment on lines 194 to 199:

```python
if addon_id in FORCE_SPARSE_CLONE:
    if catalog_entry.repository is None:
        print(
            f"ERROR: Cannot use sparse clone for {addon_id} because it has no git repo."
        )
        return
```

Copilot AI commented Feb 8, 2026:

In the FORCE_SPARSE_CLONE path, if an entry has no repository you return early, which skips caching any remaining entries (including ones that might have a usable zip_url). Prefer handling that entry with the normal zip path (or continue) rather than aborting the whole addon.

Suggested change:

```diff
-if addon_id in FORCE_SPARSE_CLONE:
-    if catalog_entry.repository is None:
-        print(
-            f"ERROR: Cannot use sparse clone for {addon_id} because it has no git repo."
-        )
-        return
+use_sparse_clone = addon_id in FORCE_SPARSE_CLONE and catalog_entry.repository is not None
+if use_sparse_clone:
```

Comment on lines 141 to 143:

```python
else:
    # TODO: Try to generate the expected URL form based on the repo location
    raise RuntimeError(f"Sparse cache entry {addon_id} has no zip_url")
```

Copilot AI commented Feb 8, 2026:

instantiate_addon() raises RuntimeError when sparse_cache is true but zip_url is missing. Since CacheWriter can set sparse_cache=True without ensuring zip_url exists, this can cause addons/branches to be skipped at runtime. Consider falling back to repository when zip_url is absent (or validate/enforce zip_url at cache-generation time).

Suggested change:

```diff
-else:
-    # TODO: Try to generate the expected URL form based on the repo location
-    raise RuntimeError(f"Sparse cache entry {addon_id} has no zip_url")
+elif self.repository:
+    # Fallback: use the repository URL when a sparse cache entry has no explicit zip_url
+    url = self.repository
+else:
+    raise RuntimeError(
+        f"Sparse cache entry {addon_id} has neither zip_url nor repository URL"
+    )
```

Copilot AI (Contributor) left a comment

Pull request overview

Copilot reviewed 5 out of 5 changed files in this pull request and generated 2 comments.

Comments suppressed due to low confidence (1)

AddonCatalogCacheCreator.py:416

  • CacheWriter.clone_or_update was changed from a @staticmethod to an instance method, but other code still calls it as a static utility (e.g. MacroCacheCreator.py imports CacheWriter specifically to call CacheWriter.clone_or_update(...)). This change will raise a TypeError at runtime. Consider keeping clone_or_update as a static/class method (and passing an optional error sink/timeout), or update all external callers accordingly as part of this PR.

```python
def clone_or_update(self, name: str, url: str, branch: str) -> None:
    """If a directory called "name" exists, and it contains a subdirectory called .git,
    then 'git fetch' is called; otherwise we use 'git clone' to make a bare, shallow
    copy of the repo (in the normal case where minimal is True), or a normal clone,
    if minimal is set to False."""
```


Comment on lines +210 to +213:

```python
catalog_entry.sparse_cache = True
self.create_local_copy_of_single_addon_with_git_sparse(
    addon_id, index, catalog_entry
)
```

Copilot AI commented Feb 8, 2026:

When addon_id is in FORCE_SPARSE_CLONE, the code unconditionally sets catalog_entry.sparse_cache = True but does not ensure catalog_entry.zip_url is present. Downstream, AddonCatalogEntry.instantiate_addon() raises if sparse_cache is true and zip_url is missing, so entries that only provide a git repo/ref will break. Either require/derive zip_url before setting sparse_cache, or fall back to the normal git path when zip_url is absent.

Suggested change:

```diff
-catalog_entry.sparse_cache = True
-self.create_local_copy_of_single_addon_with_git_sparse(
-    addon_id, index, catalog_entry
-)
+if catalog_entry.zip_url is not None:
+    catalog_entry.sparse_cache = True
+    self.create_local_copy_of_single_addon_with_git_sparse(
+        addon_id, index, catalog_entry
+    )
+else:
+    print(
+        f"WARNING: Cannot use sparse clone for {addon_id} because it has no zip_url; "
+        "falling back to normal git clone."
+    )
+    self.create_local_copy_of_single_addon_with_git(
+        addon_id, index, catalog_entry
+    )
```

Comment on lines 138 to 147:

```python
if self.sparse_cache:
    if self.zip_url:
        url = self.zip_url
    else:
        # TODO: Try to generate the expected URL form based on the repo location
        raise RuntimeError(f"Sparse cache entry {addon_id} has no zip_url")
elif self.repository:
    url = self.repository
else:
    url = self.zip_url
```

Copilot AI commented Feb 8, 2026:

instantiate_addon() raises a RuntimeError if sparse_cache is true but zip_url is missing. Since zip_url is optional for git-based entries (and sparse_cache is set automatically during cache generation for some repos), this can cause Addon Manager to crash while building the addon list. Consider a safe fallback (e.g. use repository when zip_url is absent) or make cache generation guarantee zip_url for sparse entries.

@chennes chennes force-pushed the supportIndexAndCatalog branch from 105ff4e to df3ef56 Compare February 8, 2026 19:39
Also implement sparse checkout of the huge repos, and change the data source location for the cache.
@chennes chennes force-pushed the supportIndexAndCatalog branch from df3ef56 to d787321 Compare February 8, 2026 20:44
@chennes chennes merged commit a4855d4 into FreeCAD:dev Feb 8, 2026
10 of 11 checks passed
@chennes chennes deleted the supportIndexAndCatalog branch February 8, 2026 23:51
@chennes chennes added the release to main Trigger an Action to create a PR backporting to `main` label Feb 8, 2026
github-actions bot commented Feb 8, 2026:

Backport failed for main, because it was unable to cherry-pick the commit(s).

Please cherry-pick the changes locally and resolve any conflicts.

```shell
git fetch origin main
git worktree add -d .worktree/backport-332-to-main origin/main
cd .worktree/backport-332-to-main
git switch --create backport-332-to-main
git cherry-pick -x d787321a42f4f3c2d16e0151095817a6db501c44
```

1 similar comment: github-actions bot commented Feb 9, 2026 with the same backport-failure message.
