Conversation

@timcoding1988
Collaborator

@timcoding1988 timcoding1988 commented Nov 12, 2025

New VM image suffix for the Fedora 43 base image update -> containers/automation_images#426

Checklist

Ensure you have completed the following checklist for your pull request to be reviewed:

  • Certify you wrote the patch or otherwise have the right to pass it on as an open-source patch by signing all
    commits. (git commit -s). (If needed, use git commit -s --amend). The author email must match
    the sign-off email address. See CONTRIBUTING.md
    for more information.
  • Referenced issues using Fixes: #00000 in commit message (if applicable)
  • Tests have been added/updated (or no tests are needed)
  • Documentation has been updated (or no documentation changes are needed)
  • All commits pass make validatepr (format/lint checks)
  • Release note entered in the section below (or None if no user-facing changes)

Does this PR introduce a user-facing change?

None

@openshift-ci
Contributor

openshift-ci bot commented Nov 12, 2025

[APPROVALNOTIFIER] This PR is NOT APPROVED

This pull-request has been approved by: timcoding1988
Once this PR has been reviewed and has the lgtm label, please assign lsm5 for approval. For more information see the Code Review Process.

The full list of commands accepted by this bot can be found here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@timcoding1988 timcoding1988 requested a review from Luap99 November 12, 2025 02:34
@timcoding1988 timcoding1988 force-pushed the new_image_sfx_for_fedora_43 branch from 26f0a45 to 1c9d3f9 November 12, 2025 02:38
@packit-as-a-service

[NON-BLOCKING] Packit jobs failed. @containers/packit-build please check. Everyone else, feel free to ignore.

3 similar comments

@timcoding1988 timcoding1988 marked this pull request as draft November 12, 2025 15:07
@openshift-ci openshift-ci bot added the do-not-merge/work-in-progress Indicates that a PR should not merge because it is a work in progress. label Nov 12, 2025
@timcoding1988 timcoding1988 force-pushed the new_image_sfx_for_fedora_43 branch from 9040a15 to 61ff57c November 13, 2025 13:27
@timcoding1988 timcoding1988 marked this pull request as ready for review November 13, 2025 13:27
@openshift-ci openshift-ci bot removed the do-not-merge/work-in-progress Indicates that a PR should not merge because it is a work in progress. label Nov 13, 2025
@Luap99
Member

Luap99 commented Nov 13, 2025

         =================================== FAILURES ===================================
         ____________________ ImageTestCase.test_search_compat (i=1) ____________________
         
         self = <python.rest_api.test_v2_0_0_image.ImageTestCase testMethod=test_search_compat>
         
             def test_search_compat(self):
                 url = self.podman_url + "/v1.40/images/search"
             
                 # Had issues with this test hanging when repositories not happy
                 def do_search1():
                     payload = {"term": "alpine"}
                     r = requests.get(url, params=payload, timeout=5)
                     self.assertEqual(r.status_code, 200, f"#1: {r.text}")
                     self.assertIsInstance(r.json(), list)
             
                 def do_search2():
                     payload = {"term": "alpine", "limit": 1}
                     r = requests.get(url, params=payload, timeout=5)
                     self.assertEqual(r.status_code, 200, f"#2: {r.text}")
             
                     results = r.json()
                     self.assertIsInstance(results, list)
                     self.assertEqual(len(results), 1)
             
                 def do_search3():
                     # FIXME: Research if quay.io supports is-official and which image is "official"
                     return
                     payload = {"term": "thanos", "filters": '{"is-official":["true"]}'}
                     r = requests.get(url, params=payload, timeout=5)
                     self.assertEqual(r.status_code, 200, f"#3: {r.text}")
             
                     results = r.json()
                     self.assertIsInstance(results, list)
             
                     # There should be only one official image
                     self.assertEqual(len(results), 1)
             
                 def do_search4():
                     headers = {"X-Registry-Auth": "null"}
                     payload = {"term": "alpine"}
                     r = requests.get(url, params=payload, headers=headers, timeout=5)
                     self.assertEqual(r.status_code, 200, f"#4: {r.text}")
             
                 def do_search5():
                     headers = {"X-Registry-Auth": "invalid value"}
                     payload = {"term": "alpine"}
                     r = requests.get(url, params=payload, headers=headers, timeout=5)
                     self.assertEqual(r.status_code, 400, f"#5: {r.text}")
             
                 i = 1
                 for fn in [do_search1, do_search2, do_search3, do_search4, do_search5]:
                     with self.subTest(i=i):
                         search = Process(target=fn)
         >               search.start()
         
         test/apiv2/python/rest_api/test_v2_0_0_image.py:174: 
         _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
         /usr/lib64/python3.14/multiprocessing/process.py:121: in start
             self._popen = self._Popen(self)
                           ^^^^^^^^^^^^^^^^^
         /usr/lib64/python3.14/multiprocessing/context.py:224: in _Popen
             return _default_context.get_context().Process._Popen(process_obj)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
         /usr/lib64/python3.14/multiprocessing/context.py:300: in _Popen
             return Popen(process_obj)
                    ^^^^^^^^^^^^^^^^^^
         /usr/lib64/python3.14/multiprocessing/popen_forkserver.py:35: in __init__
             super().__init__(process_obj)
         /usr/lib64/python3.14/multiprocessing/popen_fork.py:20: in __init__
             self._launch(process_obj)
         /usr/lib64/python3.14/multiprocessing/popen_forkserver.py:47: in _launch
             reduction.dump(process_obj, buf)
         _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
         
         obj = <Process name='Process-1' parent=33305 initial>
         file = <_io.BytesIO object at 0x7f5462f45f30>, protocol = None
         
             def dump(obj, file, protocol=None):
                 '''Replacement for pickle.dump() using ForkingPickler.'''
         >       ForkingPickler(file, protocol).dump(obj)
         E       _pickle.PicklingError: Can't pickle local object <function ImageTestCase.test_search_compat.<locals>.do_search1 at 0x7f5462f824b0>
         E       when serializing dict item '_target'
         E       when serializing multiprocessing.context.Process state
         E       when serializing multiprocessing.context.Process object
         
         /usr/lib64/python3.14/multiprocessing/reduction.py:60: PicklingError
         [The same traceback repeats four more times, once per subtest, for do_search2 (Process-2) through do_search5 (Process-5), each ending in the same error:]
         E       _pickle.PicklingError: Can't pickle local object <function ImageTestCase.test_search_compat.<locals>.do_searchN at 0x...>

https://api.cirrus-ci.com/v1/artifact/task/6535796057964544/html/apiv2-podman-fedora-43-root-host.log.html
Looks like something in Python changed? I don't get what it is trying to say; maybe you can make sense of it, otherwise @inknos or @jwhonce might be able to advise on that?

@timcoding1988
Collaborator Author

@Luap99 1. https://docs.fedoraproject.org/en-US/fedora/latest/release-notes/developers/#python-3-14 is related to Fedora 43. Also, I made a change on automation_images due to the failure log.

@inknos
Collaborator

inknos commented Nov 13, 2025

Looks like something in Python changed?

has to be python3.13, taking a look

edit: oh wow, 3.14, wasn't expecting it

@inknos
Collaborator

inknos commented Nov 13, 2025

From
https://docs.python.org/dev/whatsnew/3.14.html#changes-in-the-python-api

If you encounter NameErrors or pickling errors coming out of multiprocessing or concurrent.futures, see the forkserver restrictions.

which leads us to this
https://docs.python.org/dev/library/multiprocessing.html#multiprocessing-programming-forkserver

It's likely to be easy to reproduce and fix locally. I can try
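For reference, the failure mode can be reproduced without the test suite at all. This is a minimal standalone sketch (not the actual test code) showing that the forkserver start method, the new Python 3.14 default on Linux, must pickle the Process target and therefore rejects local functions:

```python
import multiprocessing

def run_repro():
    """Start a Process whose target is a local function under forkserver."""
    # forkserver must serialize the target and send it to the server
    # process, so the target has to be picklable; plain fork does not.
    ctx = multiprocessing.get_context("forkserver")

    def local_target():  # defined inside a function -> not picklable
        pass

    p = ctx.Process(target=local_target)
    try:
        p.start()
        p.join()
        return "started ok"
    except Exception as e:
        # Python 3.14 raises _pickle.PicklingError here; older versions
        # raise a similar "Can't pickle local object" error.
        return f"{type(e).__name__}: {e}"

if __name__ == "__main__":
    print(run_repro())
```

The same code with `get_context("fork")` starts the process fine, which is why these tests only broke once the default changed.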

@Luap99
Member

Luap99 commented Nov 13, 2025

The podman diff with buildah container failure can be reproduced by setting export NOTIFY_SOCKET=/run/systemd/notify before running the test; I have no idea why the notify socket ends up getting leaked into the container fs there, though.

The easy thing would be to unset the env in the CI setup, which is likely what we want to do anyway as it affects many other things, but I am still not sure why this specific test would care about it.

The other failure, podman volumes with XFS quotas, seems to be related to SELinux, judging by the journal logs.

I think this is the same thing we have been seeing in the podman-machine-os CI; the main point here is that this is testing on a tmpdir root: --root /tmp/CI_SBJE/podman_bats.ipexyS/root

Nov 13 13:43:18 ip-172-31-27-118.ec2.internal audit[45020]: AVC avc: denied { read } for pid=45020 comm="top" path="/bin/busybox" dev="tmpfs" ino=9753 scontext=system_u:system_r:container_t:s0:c28,c816 tcontext=system_u:object_r:container_var_run_t:s0 tclass=file permissive=0
Nov 13 13:43:18 ip-172-31-27-118.ec2.internal testctr[45018]: Error relocating /usr/bin/top: RELRO protection failed: No error information

cc @lsm5

https://api.cirrus-ci.com/v1/artifact/task/5071246569766912/html/sys-podman-fedora-43-aarch64-root-host.log.html

@inknos
Collaborator

inknos commented Nov 13, 2025

@timcoding1988

diff --git a/test/apiv2/python/rest_api/test_v2_0_0_image.py b/test/apiv2/python/rest_api/test_v2_0_0_image.py
index e62b210ebc..8c354c67ab 100644
--- a/test/apiv2/python/rest_api/test_v2_0_0_image.py
+++ b/test/apiv2/python/rest_api/test_v2_0_0_image.py
@@ -1,6 +1,6 @@
 import json
 import unittest
-from multiprocessing import Process
+from multiprocessing import Process, set_start_method
 
 import requests
 from dateutil.parser import parse
@@ -168,6 +168,9 @@ class ImageTestCase(APITestCase):
             self.assertEqual(r.status_code, 400, f"#5: {r.text}")
 
         i = 1
+        # Need to explicitely set start method
+        # https://docs.python.org/dev/library/multiprocessing.html#contexts-and-start-methods
+        set_start_method('fork')
         for fn in [do_search1, do_search2, do_search3, do_search4, do_search5]:
             with self.subTest(i=i):
                 search = Process(target=fn)

This fixed the Python error for me. I am forcing the start method to fork, as it should be the default on Linux; however, on Windows and macOS the default is spawn, so I am not sure if we need to fix that conditionally. But on Linux this seems right.

@Luap99
Member

Luap99 commented Nov 13, 2025

The podman diff with buildah container failure can be reproduced by setting export NOTIFY_SOCKET=/run/systemd/notify before running the test; I have no idea why the notify socket ends up getting leaked into the container fs there, though.

Ok, I had a look, and it seems buildah leaks the NOTIFY_SOCKET env into crun, and crun then forwards that socket into the container. As such, it creates the inodes on the rootfs, thus making them appear in the final image that the test diffs.
I assume this is expected behavior, but maybe @nalind and/or @giuseppe could confirm this?

@timcoding1988 So in this case we should indeed make sure NOTIFY_SOCKET is unset as part of the Cirrus CI setup scripts, as it could interact weirdly with other tests as well.
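A hypothetical sketch of that CI setup change (the actual Cirrus setup script name and location are not shown here):

```shell
#!/usr/bin/env bash
# Hypothetical CI setup snippet: drop systemd's notify socket from the
# environment so buildah/crun cannot forward it into test containers,
# where it creates inodes that later show up in podman diff output.
unset NOTIFY_SOCKET

# Sanity check: fail loudly if the variable somehow survived.
if [ -n "${NOTIFY_SOCKET:-}" ]; then
    echo "NOTIFY_SOCKET is still set" >&2
    exit 1
fi
echo "NOTIFY_SOCKET cleared"
```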

@giuseppe
Member

Ok, I had a look, and it seems buildah leaks the NOTIFY_SOCKET env into crun, and crun then forwards that socket into the container. As such, it creates the inodes on the rootfs, thus making them appear in the final image that the test diffs. I assume this is expected behavior, but maybe @nalind and/or @giuseppe could confirm this?

@timcoding1988 So in this case we should indeed make sure NOTIFY_SOCKET is unset as part of the cirrus CI setup scripts as it could interact weirdly with other tests as well.

Yes, that is expected, as the container payload will use NOTIFY_SOCKET to notify when it is ready.

The easy thing would be to unset the env in the CI setup, which is likely what we want to do anyway as it affects many other things, but I am still not sure why this specific test would care about it.

And I think this is the correct thing to do: if you are not using the systemd notify API, there is no reason to pass the env variable.

@timcoding1988 timcoding1988 force-pushed the new_image_sfx_for_fedora_43 branch from 09e99ed to a51bde0 November 14, 2025 17:16