Conversation

@jobselko (Contributor)

No description provided.

@jobselko self-assigned this on Nov 21, 2025
@github-actions bot added the labels multi-commit (add to bypass single commit lint check), no-changelog, and no-issue on Nov 21, 2025
@jobselko marked this pull request as a draft on November 21, 2025 at 22:11
@jobselko changed the title from "Implement PEP 658" to "[PULP-946] Implement PEP 658" on Nov 21, 2025
@jobselko force-pushed the pep_658 branch 2 times, most recently from 848fa89 to 31cd665, on December 4, 2025 at 17:24
@jobselko force-pushed the pep_658 branch 6 times, most recently from 5e4f2cb to 69e1326, on December 16, 2025 at 15:44
@gerrod3 (Contributor) left a comment:
Looking pretty good.

remote=self.remote,
deferred_download=self.deferred_download,
)
d_artifacts.append(metadata_artifact)

One last thing we need to do is add the metadata_sha256 to the package, since the PyPI JSON API doesn't provide the value:

package.metadata_sha256 = md_sha256
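For reference, a rough sketch of how the stage might tie the two together; names such as url, md_sha256, package and d_artifacts follow the snippet quoted above and are assumptions, not the PR's exact code:

```python
from pulpcore.plugin.models import Artifact
from pulpcore.plugin.stages import DeclarativeArtifact

if md_sha256:
    metadata_artifact = DeclarativeArtifact(
        artifact=Artifact(sha256=md_sha256),       # expected digest of the .metadata file
        url=url + ".metadata",                     # PEP 658: served next to the distribution file
        relative_path=package.filename + ".metadata",
        remote=self.remote,
        deferred_download=self.deferred_download,
    )
    d_artifacts.append(metadata_artifact)
    # The PyPI JSON API has no field for this digest, so record it on the
    # content unit while it is in hand.
    package.metadata_sha256 = md_sha256
```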

Comment on lines +305 to +309
finally:
if temp_wheel_path and os.path.exists(temp_wheel_path):
os.unlink(temp_wheel_path)
if temp_metadata_path and os.path.exists(temp_metadata_path):
os.unlink(temp_metadata_path)

Very nice to ensure we clean up the files, but if we assume this method is always called either in a task or with a tmp_dir, do we need to clean them up ourselves? The tmp_dir should auto-clean, even on error, deleting the temp files with it, so I think we can remove this.
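A minimal illustration of the auto-cleanup behaviour this relies on: the context manager removes the whole tree on exit, even when an exception escapes, so the explicit unlink calls in the finally block become redundant.

```python
import os
import tempfile

try:
    with tempfile.TemporaryDirectory() as tmp_dir:
        wheel_path = os.path.join(tmp_dir, "example-1.0-py3-none-any.whl")
        open(wheel_path, "wb").close()
        raise RuntimeError("simulated failure mid-extraction")
except RuntimeError:
    pass

print(os.path.exists(wheel_path))  # False: the directory and its files were removed
```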

Comment on lines +80 to +81
html_content = response.text
assert f'data-dist-info-metadata="sha256={PYTHON_WHEEL_METADATA_SHA256}' in html_content

Maybe we can add this check into ensure_simple: pass in a new metadata_sha_digests dict that checks whether certain links have this field set correctly.
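A hypothetical sketch of the idea; the real ensure_simple helper in the functional test suite has its own signature and assertions, so the function name and parameter below are illustrative only:

```python
def assert_metadata_digests(html_content: str, metadata_sha_digests: dict[str, str]) -> None:
    """Assert each given file advertises its PEP 658 metadata digest on the simple page."""
    for filename, digest in metadata_sha_digests.items():
        assert filename in html_content, f"{filename} is not listed on the simple page"
        # Mirrors the substring check in the quoted test.
        assert f'data-dist-info-metadata="sha256={digest}"' in html_content, (
            f"{filename} has no (or a wrong) PEP 658 digest"
        )
```

A stricter version would scope the digest check to the anchor for that specific filename rather than the whole page.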

"""
Test that the sync of a Python wheel package creates a metadata artifact.
"""
remote = python_remote_factory(includes=PYTHON_XS_PROJECT_SPECIFIER)

We should choose a different package, since this test runs in parallel and the upload tests will be creating the same content. All the wheels in the fixtures should have the metadata file available.

created_count = 0
skipped_count = 0

with tempfile.TemporaryDirectory() as temp_dir:

Going to want to use settings.WORKING_DIRECTORY here.
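That is, something along these lines (a sketch, assuming the code runs where Django settings are loaded):

```python
# Keep scratch files under Pulp's configured working directory instead of the
# system-wide temp dir.
import tempfile

from django.conf import settings

with tempfile.TemporaryDirectory(dir=settings.WORKING_DIRECTORY) as temp_dir:
    ...  # download the wheels and extract their METADATA files here
```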

packages = (
PythonPackageContent.objects.filter(metadata_sha256__isnull=False)
.exclude(metadata_sha256="")
.prefetch_related("contentartifact_set")

Suggested change:
- .prefetch_related("contentartifact_set")
+ .prefetch_related("contentartifact_set")
+ .only("filename", "metadata_sha256")

metadata_artifact.save()
except IntegrityError:
metadata_artifact = artifact_model.objects.get(
sha256=metadata_artifact.sha256, pulp_domain=get_domain()

Suggested change:
- sha256=metadata_artifact.sha256, pulp_domain=get_domain()
+ sha256=metadata_artifact.sha256, pulp_domain=main_artifact.pulp_domain

get_domain() will always return the default domain in the migration. We need this migration to work across all domains.
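For clarity, a sketch of the corrected block with the reasoning as comments; metadata_artifact, artifact_model and main_artifact are the names from the snippet above:

```python
from django.db import IntegrityError

try:
    metadata_artifact.save()
except IntegrityError:
    # An artifact with this sha256 already exists in this domain. Look it up in
    # the domain of the wheel's own artifact rather than via get_domain(), which
    # always resolves to the default domain while the migration runs.
    metadata_artifact = artifact_model.objects.get(
        sha256=metadata_artifact.sha256,
        pulp_domain=main_artifact.pulp_domain,
    )
```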

@jobselko marked this pull request as ready for review on December 16, 2025 at 22:08