Add support for an Index alongside the Catalog #332
Conversation
Pull request overview
This PR updates the addon cache generation flow to consume the new `AddonsData/Index.json`, generate both an index cache and a curated-only catalog cache for backward compatibility, and add sparse git cloning for very large repos.
Changes:
- Switch the cache source from the legacy `AddonCatalog.json` URL to `AddonsData/Index.json`.
- Generate a full `addon_index_cache.zip` plus a curated-only `addon_catalog_cache.zip` for older Addon Manager versions.
- Add sparse-clone support and clone timeout/error reporting for large repositories.
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 9 comments.
| File | Description |
|---|---|
| `AddonCatalogCacheCreator.py` | Fetch from `Index.json`, write the new index and catalog cache artifacts, add sparse clone plus timeout/error tracking. |
| `AddonCatalog.py` | Add `curated` / `sparse_cache` fields and use the `curated` flag on the instantiated `Addon`. |
| `Addon.py` | Add an `Addon.curated` property for UI/logic to consume. |
| `AddonCatalog.schema.json` | Update the schema to include the new `curated` boolean field. |
| `AddonManagerTest/app/test_addon_catalog_cache_creator.py` | Adjust tests for the updated cache writer behavior (currently inconsistent with the implementation). |
AddonCatalogCacheCreator.py
Outdated
```diff
 ]
-completed_process = subprocess.run(command)
+try:
+    completed_process = subprocess.run(command, CLONE_TIMEOUT)
```
In `clone_or_update`, `subprocess.run(command, CLONE_TIMEOUT)` passes `CLONE_TIMEOUT` as a positional argument (it becomes `bufsize`), so the clone will never time out. Use `timeout=CLONE_TIMEOUT` (and consider `check=True` to raise on a non-zero exit) so oversized repos can be handled as intended.
```diff
-completed_process = subprocess.run(command, CLONE_TIMEOUT)
+completed_process = subprocess.run(command, timeout=CLONE_TIMEOUT)
```
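For reference, a minimal sketch of the timeout handling this suggestion is driving at, assuming `CLONE_TIMEOUT` is in seconds and that failures should be collected rather than raised (the `run_clone` helper and `errors` list are illustrative, not the PR's actual code):

```python
import subprocess

CLONE_TIMEOUT = 600  # seconds; assumed value for illustration


def run_clone(command: list, errors: list) -> bool:
    """Run a git command, recording timeouts and non-zero exits as errors."""
    try:
        subprocess.run(command, timeout=CLONE_TIMEOUT, check=True)
        return True
    except subprocess.TimeoutExpired:
        errors.append(f"Timed out after {CLONE_TIMEOUT}s: {' '.join(command)}")
    except subprocess.CalledProcessError as exc:
        errors.append(f"Exit code {exc.returncode}: {' '.join(command)}")
    return False
```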
AddonCatalogCacheCreator.py
Outdated
```python
# Write the entire index for versions of the Addon Manager after 2026-01-24
with zipfile.ZipFile(
    os.path.join(self.cwd, "addon_index_cache.zip"), "w", zipfile.ZIP_DEFLATED
) as zipf:
    zipf.writestr(
        "addon_index_cache.json",
        json.dumps(recursive_serialize(self.catalog.get_catalog()), indent=" "),
    )
```
This adds a new `addon_index_cache.zip` artifact, but unlike the other caches it does not generate a corresponding `.sha256`. `addonmanager_workers_startup.new_cache_available()` expects a `<cache_name>_cache_url + '.sha256'` endpoint, so the index cache may not be refreshable/detectable consistently. Consider writing `addon_index_cache.zip.sha256` as well (and updating the final log message accordingly).
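A minimal sketch of such a sidecar writer, assuming the digest should cover the zip bytes on disk (`write_sha256_sidecar` is an illustrative helper, not existing code):

```python
import hashlib


def write_sha256_sidecar(zip_path: str) -> None:
    """Write <zip_path>.sha256 containing the hex digest of the zip file."""
    digest = hashlib.sha256()
    with open(zip_path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    with open(zip_path + ".sha256", "w", encoding="utf-8") as f:
        f.write(digest.hexdigest() + "\n")
```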
```diff
 writer = accc.CacheWriter()
-writer.catalog = MagicMock()
+writer.index = MagicMock()
 writer.cwd = os.path.abspath(os.path.join("home", "cache"))
 writer.create_local_copy_of_single_addon("TestMod", catalog_entries)
```
These tests now set `writer.index = MagicMock()`, but `CacheWriter` uses `self.catalog` (e.g. `self.catalog.add_metadata_to_entry(...)` in `create_local_copy_of_single_addon`). With `catalog` left as `None`, the test will raise `AttributeError`. Update the test to set `writer.catalog` (or mock the `add_metadata_to_entry` call path) instead of `writer.index`.
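One possible shape of the fix, assuming the test context shown above (`accc`, `MagicMock`, `catalog_entries`) and that the test only needs `add_metadata_to_entry` to be callable:

```python
writer = accc.CacheWriter()
writer.catalog = MagicMock()  # CacheWriter reads self.catalog, not self.index
writer.cwd = os.path.abspath(os.path.join("home", "cache"))
writer.create_local_copy_of_single_addon("TestMod", catalog_entries)
writer.catalog.add_metadata_to_entry.assert_called()  # sanity-check the metadata path
```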
```diff
 writer = accc.CacheWriter()
-writer.catalog = MagicMock()
+writer.index = MagicMock()
 writer.cwd = os.path.abspath(os.path.join("home", "cache"))
 writer.create_local_copy_of_single_addon("TestMod", catalog_entries)
```
Same issue as above: `writer.index` is not used by `CacheWriter`, but `create_local_copy_of_single_addon()` relies on `writer.catalog` being set to something that has `add_metadata_to_entry`. Setting `writer.index` here will cause failures when the test hits the metadata/zip steps.
```python
mock_fetch_catalog.return_value = MockCatalog()
writer = accc.CacheWriter()
writer.index = MockCatalog()
writer.create_local_copy_of_addons()
```
`create_local_copy_of_addons()` iterates `self.catalog.get_catalog()`, but this test sets `writer.index` and never sets `writer.catalog`, so it will fail with `AttributeError`. Also, the test still references `accc.EXCLUDED_REPOS`, which no longer exists (and the production code no longer skips those repos). Update the test data/expectations to match the new behavior (e.g. set `writer.catalog = MockCatalog()` and adjust the skipped-repo assertion).
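A sketch of the adjusted setup, assuming `MockCatalog` already provides `get_catalog()` (the skipped-repo assertion would need a separate change to match whatever exclusion behavior remains):

```python
mock_fetch_catalog.return_value = MockCatalog()
writer = accc.CacheWriter()
writer.catalog = MockCatalog()  # create_local_copy_of_addons() iterates self.catalog.get_catalog()
writer.create_local_copy_of_addons()
```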
```python
def test_generate_cache_entry_with_approval(self):
    """If the addon appears in the catalog (as opposed to just the index), it gets marked as approved."""
```
`test_generate_cache_entry_with_approval` currently has no assertions and will always pass, so it doesn't actually validate the new curated/approved behavior described in the docstring. Either implement the test logic (assert the curated filtering/flagging) or remove the placeholder to avoid giving a false sense of coverage.
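A rough skeleton of what the test could assert; the `is_curated` helper and addon IDs below are hypothetical stand-ins for whatever entry point the cache writer actually exposes:

```python
def test_generate_cache_entry_with_approval(self):
    """If the addon appears in the catalog (as opposed to just the index), it gets marked as approved."""
    writer = accc.CacheWriter()
    writer.catalog = MagicMock()
    # Hypothetical setup: one addon in the curated catalog, one only in the index
    writer.catalog.get_catalog.return_value = {"CuratedMod": [MagicMock()]}
    self.assertTrue(writer.is_curated("CuratedMod"))     # hypothetical helper
    self.assertFalse(writer.is_curated("IndexOnlyMod"))  # hypothetical helper
```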
```diff
 def write(self, addon_id: Optional[str] = None) -> None:
     original_working_directory = os.getcwd()
     os.makedirs(self.cwd, exist_ok=True)
     os.chdir(self.cwd)
-    self.create_local_copy_of_addons()
+
+    fetcher = CatalogFetcher()
+    self.catalog = fetcher.catalog
+
+    if addon_id is None:
+        self.create_local_copy_of_addons()
+    else:
+        catalog = self.catalog.get_catalog()
+        if addon_id not in catalog:
+            raise RuntimeError(f"ERROR: Addon {addon_id} not in index")
+        catalog_entries = catalog[addon_id]
+        self.create_local_copy_of_single_addon(addon_id, catalog_entries)
```
`write()` changes the process working directory with `os.chdir(self.cwd)` but does not restore it if an exception occurs (e.g. `addon_id` not in the index, a fetch failure, or a zip write error). Wrap the body in try/finally and always `os.chdir(original_working_directory)` to avoid leaking state to callers/tests.
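A sketch of that structure, reusing the body shown in the diff above:

```python
def write(self, addon_id: Optional[str] = None) -> None:
    original_working_directory = os.getcwd()
    os.makedirs(self.cwd, exist_ok=True)
    os.chdir(self.cwd)
    try:
        fetcher = CatalogFetcher()
        self.catalog = fetcher.catalog
        if addon_id is None:
            self.create_local_copy_of_addons()
        else:
            catalog = self.catalog.get_catalog()
            if addon_id not in catalog:
                raise RuntimeError(f"ERROR: Addon {addon_id} not in index")
            self.create_local_copy_of_single_addon(addon_id, catalog[addon_id])
    finally:
        os.chdir(original_working_directory)  # restore even when an error is raised
```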
AddonCatalogCacheCreator.py
Outdated
```python
if addon_id in FORCE_SPARSE_CLONE:
    if catalog_entry.repository is None:
        print(
            f"ERROR: Cannot use sparse clone for {addon_id} because it has no git repo."
        )
        return
```
In the `FORCE_SPARSE_CLONE` path, if an entry has no repository the early `return` skips caching all remaining entries (including ones that might have a usable `zip_url`). Prefer handling that entry via the normal zip path (or `continue`) rather than aborting the whole addon.
```diff
-if addon_id in FORCE_SPARSE_CLONE:
-    if catalog_entry.repository is None:
-        print(
-            f"ERROR: Cannot use sparse clone for {addon_id} because it has no git repo."
-        )
-        return
+use_sparse_clone = addon_id in FORCE_SPARSE_CLONE and catalog_entry.repository is not None
+if use_sparse_clone:
```
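With the flag hoisted like this, an entry that lacks a repository simply falls through to the normal zip-based path instead of aborting the remaining entries for the addon.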
```python
else:
    # TODO: Try to generate the expected URL form based on the repo location
    raise RuntimeError(f"Sparse cache entry {addon_id} has no zip_url")
```
`instantiate_addon()` raises `RuntimeError` when `sparse_cache` is true but `zip_url` is missing. Since `CacheWriter` can set `sparse_cache=True` without ensuring `zip_url` exists, this can cause addons/branches to be skipped at runtime. Consider falling back to `repository` when `zip_url` is absent (or validate/enforce `zip_url` at cache-generation time).
```diff
-else:
-    # TODO: Try to generate the expected URL form based on the repo location
-    raise RuntimeError(f"Sparse cache entry {addon_id} has no zip_url")
+elif self.repository:
+    # Fallback: use the repository URL when a sparse cache entry has no explicit zip_url
+    url = self.repository
+else:
+    raise RuntimeError(
+        f"Sparse cache entry {addon_id} has neither zip_url nor repository URL"
+    )
```
Force-pushed from ed1fdae to 73635be
Force-pushed from 73635be to 105ff4e
Pull request overview
Copilot reviewed 5 out of 5 changed files in this pull request and generated 2 comments.
Comments suppressed due to low confidence (1)
AddonCatalogCacheCreator.py:416
`CacheWriter.clone_or_update` was changed from a `@staticmethod` to an instance method, but other code still calls it as a static utility (e.g. `MacroCacheCreator.py` imports `CacheWriter` specifically to call `CacheWriter.clone_or_update(...)`). This change will raise a `TypeError` at runtime. Consider keeping `clone_or_update` as a static/class method (and passing an optional error sink/timeout), or update all external callers accordingly as part of this PR.
```python
def clone_or_update(self, name: str, url: str, branch: str) -> None:
    """If a directory called "name" exists, and it contains a subdirectory called .git,
    then 'git fetch' is called; otherwise we use 'git clone' to make a bare, shallow
    copy of the repo (in the normal case where minimal is True), or a normal clone,
    if minimal is set to False."""
```
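One way to keep the external call sites intact, as the comment suggests: leave the method static and thread the timeout and error sink through parameters. A sketch under those assumptions; the git invocations here are illustrative, not the module's exact commands:

```python
import os
import subprocess


class CacheWriter:
    @staticmethod
    def clone_or_update(name, url, branch, timeout=None, errors=None):
        """Fetch if a .git checkout named `name` exists; otherwise shallow-clone."""
        if os.path.isdir(os.path.join(name, ".git")):
            command = ["git", "-C", name, "fetch", "origin", branch]
        else:
            command = ["git", "clone", "--depth", "1", "--branch", branch, url, name]
        try:
            subprocess.run(command, timeout=timeout, check=True)
        except (subprocess.TimeoutExpired, subprocess.CalledProcessError) as exc:
            if errors is not None:  # optional sink so callers can collect failures
                errors.append(f"{name}: {exc}")
```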
```python
catalog_entry.sparse_cache = True
self.create_local_copy_of_single_addon_with_git_sparse(
    addon_id, index, catalog_entry
)
```
When `addon_id` is in `FORCE_SPARSE_CLONE`, the code unconditionally sets `catalog_entry.sparse_cache = True` but does not ensure `catalog_entry.zip_url` is present. Downstream, `AddonCatalogEntry.instantiate_addon()` raises if `sparse_cache` is true and `zip_url` is missing, so entries that only provide a git repo/ref will break. Either require/derive `zip_url` before setting `sparse_cache`, or fall back to the normal git path when `zip_url` is absent.
```diff
-catalog_entry.sparse_cache = True
-self.create_local_copy_of_single_addon_with_git_sparse(
-    addon_id, index, catalog_entry
-)
+if catalog_entry.zip_url is not None:
+    catalog_entry.sparse_cache = True
+    self.create_local_copy_of_single_addon_with_git_sparse(
+        addon_id, index, catalog_entry
+    )
+else:
+    print(
+        f"WARNING: Cannot use sparse clone for {addon_id} because it has no zip_url; "
+        "falling back to normal git clone."
+    )
+    self.create_local_copy_of_single_addon_with_git(
+        addon_id, index, catalog_entry
+    )
```
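This keeps `sparse_cache` opt-in only when a `zip_url` actually exists, so git-only entries continue through the regular clone path and `instantiate_addon()` never sees a sparse entry it cannot resolve.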
```python
if self.sparse_cache:
    if self.zip_url:
        url = self.zip_url
    else:
        # TODO: Try to generate the expected URL form based on the repo location
        raise RuntimeError(f"Sparse cache entry {addon_id} has no zip_url")
elif self.repository:
    url = self.repository
else:
    url = self.zip_url
```
`instantiate_addon()` raises a `RuntimeError` if `sparse_cache` is true but `zip_url` is missing. Since `zip_url` is optional for git-based entries (and `sparse_cache` is set automatically during cache generation for some repos), this can cause the Addon Manager to crash while building the addon list. Consider a safe fallback (e.g. use `repository` when `zip_url` is absent) or make cache generation guarantee `zip_url` for sparse entries.
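A sketch of the safe-fallback variant of that URL selection, reusing the structure shown above:

```python
if self.sparse_cache:
    # Prefer the explicit zip_url; fall back to the git repository rather than raising
    if self.zip_url:
        url = self.zip_url
    elif self.repository:
        url = self.repository
    else:
        raise RuntimeError(
            f"Sparse cache entry {addon_id} has neither zip_url nor repository"
        )
elif self.repository:
    url = self.repository
else:
    url = self.zip_url
```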
Force-pushed from 105ff4e to df3ef56
Also implement sparse checkout of the huge repos, and change the data source location for the cache.
Force-pushed from df3ef56 to d787321
Backport failed for `main`. Please cherry-pick the changes locally and resolve any conflicts:

```bash
git fetch origin main
git worktree add -d .worktree/backport-332-to-main origin/main
cd .worktree/backport-332-to-main
git switch --create backport-332-to-main
git cherry-pick -x d787321a42f4f3c2d16e0151095817a6db501c44
```
1 similar comment