feat(BA-3032): Add Hive router setup in install_halfstack() #6740
Changes from all commits
@@ -264,6 +264,43 @@ async def etcd_get_json(self, key: str) -> Any:

    async def install_halfstack(self) -> None:
        self.log_header("Installing halfstack...")

        self.log_header("Generating supergraph.graphql via rover CLI...")

        compose_cmd = [
            "rover",
            "supergraph",
            "compose",
            "--config",
            "configs/graphql/supergraph.yaml",
        ]
        output_path = "docs/manager/graphql-reference/supergraph.graphql"

        proc = await asyncio.create_subprocess_exec(
            *compose_cmd,
            stdout=asyncio.subprocess.PIPE,
            stderr=asyncio.subprocess.PIPE,
        )
        stdout, stderr = await proc.communicate()
        if proc.returncode != 0:
            raise RuntimeError(f"Failed to compose supergraph schema:\n{stderr.decode()}")
Suggested change:

-            raise RuntimeError(f"Failed to compose supergraph schema:\n{stderr.decode()}")
+            raise RuntimeError(
+                f"Failed to compose supergraph schema using rover CLI.\n"
+                f"Command: {' '.join(compose_cmd)}\n"
+                f"Error output:\n{stderr.decode()}"
+            )
Copilot (AI) commented on Nov 12, 2025:
This uses synchronous file I/O in an async function. Since aiofiles is already imported in this file (line 23), consider using `async with aiofiles.open(output_path, "wb") as f: await f.write(stdout)` for consistency with the async context and to avoid blocking the event loop.
Suggested change:

-        with open(output_path, "wb") as f:
-            f.write(stdout)
+        async with aiofiles.open(output_path, "wb") as f:
+            await f.write(stdout)
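As a rough, self-contained sketch of what the suggested change amounts to (the helper name `write_supergraph_schema` is illustrative, not part of the PR, and it assumes the `aiofiles` package that the comment says is already imported in this script):

```python
import aiofiles  # third-party async file I/O, already imported in this script per the comment


async def write_supergraph_schema(output_path: str, schema_bytes: bytes) -> None:
    # Write the composed supergraph schema without blocking the event loop.
    async with aiofiles.open(output_path, "wb") as f:
        await f.write(schema_bytes)
```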
Copilot (AI) commented on Nov 12, 2025:
Using magic number [4] to traverse parent directories is fragile and depends on the exact file structure. If the file is moved or the directory structure changes, this will break. Consider using a more robust approach like searching for a marker file (e.g., 'BUILD_ROOT') or defining the project root explicitly in the context.
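A minimal sketch of the marker-file approach the comment suggests; the marker name `BUILD_ROOT` and the helper `find_project_root` are illustrative, not part of the PR:

```python
from pathlib import Path


def find_project_root(start: Path, marker: str = "BUILD_ROOT") -> Path:
    # Walk upward from `start` until a directory containing the marker file is found.
    for candidate in (start, *start.parents):
        if (candidate / marker).exists():
            return candidate
    raise RuntimeError(f"No {marker!r} marker found above {start}")


# Usage: anchor on this file's location instead of a hard-coded parents[4] index.
project_root = find_project_root(Path(__file__).resolve().parent)
```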
A related review comment: the output path is a relative string path that depends on the current working directory. This could fail if the script is not run from the expected project root. Consider using `project_root / output_path` to make it absolute and consistent with the later file operations.
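A minimal sketch of anchoring the output path to an explicit project root, per the comment; the helper name `resolve_output_path` is illustrative, and `project_root` would come from whatever root-detection the script adopts (e.g. the marker-file sketch above):

```python
from pathlib import Path


def resolve_output_path(project_root: Path) -> Path:
    # Anchor the docs-relative schema path to an explicit project root so that
    # later file operations do not depend on the current working directory.
    return project_root / "docs" / "manager" / "graphql-reference" / "supergraph.graphql"


# Usage (hypothetical):
#   output_path = resolve_output_path(project_root)
#   output_path.parent.mkdir(parents=True, exist_ok=True)
```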