Draft

Changes from all commits
27 commits
0890d90
Miniforge 24.3.0-0 with Python 3.10.14 + install of Python 3.10.18 in…
vtaskow Oct 10, 2025
7ef90ec
Remove install of latest 3.10.X python inside the conda dockerfile
vtaskow Oct 10, 2025
fccefe2
Reduce image size to 366 MB
vtaskow Oct 10, 2025
b5e5d89
Python - 3.10, updated licences, deps, fmt, tests all good. Wrappers …
vtaskow Oct 23, 2025
0d251bb
Working version for Python 3.12
vtaskow Oct 23, 2025
2e3d597
Update dev-reqs in testing/scripts; alibi requires numpy < 2.0.0 so f…
vtaskow Oct 23, 2025
2de459c
Updated versions of numpy, protobuf and similar allow for tests insid…
vtaskow Oct 29, 2025
9c9b6a7
alibi-explain-server component compatibility with python 3.12: tests …
vtaskow Nov 5, 2025
0cf5d45
alibi-explain-server: fix mypy and run make fmt and make lint
vtaskow Nov 6, 2025
b74aabd
wrappers:s2i:python - fix several vulnerabilities in Dockerfile.conda…
vtaskow Nov 6, 2025
fec447c
wrappers:s2i:python - replace the ensurepip package in conda env with…
vtaskow Nov 7, 2025
329e5ad
alibi-explain-server - install poetry with conda pip, replace pip wit…
vtaskow Nov 7, 2025
6c6d342
Upgrade tensorflow from 2.17.1 to 2.18.1 because of upgrade of protob…
vtaskow Nov 10, 2025
f35ff52
Upgrade xgboost to 1.7.6 so it's easy to install on macOS as well
tyndria Nov 11, 2025
d29c3df
Upgrade tf, tf-keras, protobuf, grpcio-tools in alibi-explain server
tyndria Nov 11, 2025
5829378
Temporarily remove failing commands in testing/scripts/Makefile
tyndria Nov 11, 2025
c0990d5
Upgrade mlserver and grpcio-tools in the testing/scripts/dev_requirem…
tyndria Nov 11, 2025
6e5f5af
Upgrade tensorflow in the testing/scripts dev_requirements
tyndria Nov 11, 2025
a3defe9
Remove building alibi-detect-server for now
tyndria Nov 11, 2025
db676ab
Change base image python for few models for tests
tyndria Nov 11, 2025
0180bf2
replace seldon-core-s2i-python38 image with python 3.12 for another m…
tyndria Nov 11, 2025
3e68c7d
Fixing nbconvert upgrade
tyndria Nov 11, 2025
fbda260
Hacky fix for missing get_ipython with new nbconvert
tyndria Nov 11, 2025
26975c3
Run notebook with ipython by default
tyndria Nov 11, 2025
20a34e6
Fixing keda-operator failing pod
tyndria Nov 12, 2025
86f0e7f
Fix applying keda manifest
tyndria Nov 12, 2025
8cdb56a
python-sdk: match protobuf version with one in alibi-explain-server
vtaskow Nov 13, 2025
1 change: 1 addition & 0 deletions components/alibi-detect-server/.dockerignore
@@ -0,0 +1 @@
+_seldon_core/.tox
2 changes: 1 addition & 1 deletion components/alibi-detect-server/Makefile
@@ -3,7 +3,7 @@ SHELL := /bin/bash
 VERSION ?= $(shell cat ../../version.txt)
 DOCKER_REGISTRY ?= seldonio
 
-BASE_IMAGE ?= ${DOCKER_REGISTRY}/conda-ubi8
+BASE_IMAGE ?= ${DOCKER_REGISTRY}/conda-ubi9
 IMAGE ?= ${DOCKER_REGISTRY}/alibi-detect-server
 KIND_NAME ?= kind
8 changes: 4 additions & 4 deletions components/alibi-detect-server/pyproject.toml
@@ -1,18 +1,18 @@
 [tool.poetry]
 name = "adserver"
-version = "1.18.0"
+version = "1.19.0-dev"
 description = "Model Explanation Server"
 authors = ["Seldon Technologies Ltd. <[email protected]>"]
 license = "Business Source License 1.1"
 
 [tool.poetry.dependencies]
-python = ">=3.8,<3.11"
+python = ">=3.12, <3.13.0"
 alibi-detect = {version = "^0.11.4", extras = ["all"]}
 seldon-core = {path = "_seldon_core", develop = false}
 numpy = "*"
 cloudevents = "1.2.0"
-tensorflow = "^2.12.0"
-scikit-learn = "0.24.2"
+tensorflow = "2.17.1"
+scikit-learn = "1.7.2"
 
 elasticsearch = "7.9.1"
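The tightened `python = ">=3.12, <3.13.0"` constraint pins the server to the 3.12 series. As a rough illustration (not part of the PR — the helper name is made up), an equivalent runtime guard could look like:

```python
import sys

def interpreter_supported(version_info=sys.version_info):
    """Mirror the pyproject constraint ">=3.12, <3.13.0" at runtime.

    Any 3.12.x interpreter satisfies the range; 3.11 and 3.13 do not.
    """
    major, minor = version_info[0], version_info[1]
    return (major, minor) == (3, 12)
```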
18 changes: 10 additions & 8 deletions components/alibi-explain-server/Dockerfile
@@ -25,23 +25,22 @@ WORKDIR /microservice

# Install Poetry
ENV POETRY_HOME /microservice/.poetry
RUN curl -sSL https://install.python-poetry.org | python3 - --version 1.1.15
RUN /opt/conda/bin/pip install --no-cache-dir "poetry==2.1.2"

# Replace vulnerable pip wheel embedded inside virtualenv(CVE-2025-8869)
RUN find /opt/conda/lib/python3.12/site-packages/virtualenv/seed/wheels/embed/ -name "pip-*.whl" -delete && \
/opt/conda/bin/pip wheel pip==25.3.0 --wheel-dir /opt/conda/lib/python3.12/site-packages/virtualenv/seed/wheels/embed/

ENV PATH "$POETRY_HOME/bin:$PATH"
ENV POETRY_VIRTUALENVS_CREATE false

# Install the server
# Install the dependencies only
COPY poetry.lock pyproject.toml ./
## Disable Poetry's new installer to avoid JSONDecodeError
## https://github.com/python-poetry/poetry/issues/4210
## NOTE: Removing explicitly requirements.txt file from subdeps test
## dependencies causing false positives in Snyk.
RUN poetry config experimental.new-installer false && \
poetry install && \
rm ~/.cache/pip -rf && \
rm -f /opt/conda/lib/python3.8/site-packages/gslib/vendored/boto/requirements.txt \
/opt/conda/lib/python3.8/site-packages/gslib/vendored/oauth2client/docs/requirements.txt \
/opt/conda/lib/python3.8/site-packages/spacy/tests/package/requirements.txt
RUN poetry install --no-root


# Add licences
@@ -58,6 +57,9 @@ RUN python -m spacy download en_core_web_md
COPY alibiexplainer alibiexplainer
COPY README.md README.md

# Install the project code
RUN poetry install

FROM base as final
WORKDIR /microservice

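The Dockerfile now deletes the pip wheel that virtualenv ships in its `seed/wheels/embed/` directory and rebuilds a `pip==25.3.0` wheel in its place (CVE-2025-8869). A small sketch of how one might audit such an embed directory — the helper name is illustrative, not from the PR; wheel filenames follow the standard `pip-<version>-py3-none-any.whl` pattern:

```python
from pathlib import Path

def embedded_pip_versions(embed_dir):
    """Return pip versions parsed from pip-*.whl filenames in embed_dir.

    The version is the second dash-separated field of a wheel filename,
    e.g. "25.3.0" in pip-25.3.0-py3-none-any.whl.
    """
    return sorted(p.name.split("-")[1] for p in Path(embed_dir).glob("pip-*.whl"))
```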
8 changes: 4 additions & 4 deletions components/alibi-explain-server/Makefile
@@ -4,7 +4,7 @@ SHELL := /bin/bash
 VERSION ?= $(shell cat ../../version.txt)
 DOCKER_REGISTRY ?= seldonio
 
-BASE_IMAGE ?= ${DOCKER_REGISTRY}/conda-ubi8
+BASE_IMAGE ?= ${DOCKER_REGISTRY}/conda-ubi9
 IMAGE ?= ${DOCKER_REGISTRY}/alibiexplainer
 KIND_NAME ?= kind
 
@@ -27,7 +27,7 @@ build_apis: get_apis
 	--grpc_python_out=./ \
 	--mypy_out=./ \
 	./proto/prediction.proto
-	sed -i "s/from proto/from alibiexplainer.proto/g" alibiexplainer/proto/prediction_pb2_grpc.py
+	perl -0pi -e 's/from proto/from alibiexplainer.proto/g' alibiexplainer/proto/prediction_pb2_grpc.py
 
 dev_install:
 	poetry install
@@ -36,7 +36,7 @@ test: #type_check
 	poetry run pytest -v -W ignore
 
 type_check:
-	mypy --ignore-missing-imports alibiexplainer --exclude proto
+	mypy --ignore-missing-imports --check-untyped-defs alibiexplainer --exclude proto
 
 lint: type_check
 	isort --profile black --check . --skip proto --skip .eggs --skip .tox
@@ -47,7 +47,7 @@ fmt:
 	black . --exclude "(proto|.eggs|.tox)"
 
 docker-build:
-	docker build --file=Dockerfile --build-arg BASE_IMAGE=${BASE_IMAGE} --build-arg VERSION=${VERSION} -t ${IMAGE}:${VERSION} .
+	docker build --platform=linux/amd64 --file=Dockerfile --build-arg BASE_IMAGE=${BASE_IMAGE} --build-arg VERSION=${VERSION} -t ${IMAGE}:${VERSION} .
 
 docker-build-gpu:
 	docker build --file=Dockerfile.gpu -t ${IMAGE}-gpu:${VERSION} .
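The `sed -i` call above was swapped for `perl -0pi`, since GNU and BSD `sed` disagree on how `-i` takes its backup-suffix argument, while `perl` behaves identically on Linux and macOS. For illustration only, the same rewrite could also be done portably in Python — this helper is hypothetical and not part of the Makefile:

```python
from pathlib import Path

def fix_grpc_import(path):
    """Rewrite "from proto" imports to "from alibiexplainer.proto" in place,
    mirroring: perl -0pi -e 's/from proto/from alibiexplainer.proto/g'
    """
    p = Path(path)
    p.write_text(p.read_text().replace("from proto", "from alibiexplainer.proto"))
```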
@@ -43,7 +43,7 @@ def __init__(
 predict_fn: Callable,
 explainer: Optional[alibi.explainers.AnchorText],
 spacy_language_model: str = "en_core_web_md",
-**kwargs
+**kwargs,
 ):
 self.predict_fn = predict_fn
 if EXPLAIN_RANDOM_SEED == "True" and str(EXPLAIN_RANDOM_SEED_VALUE).isdigit():
@@ -18,12 +18,14 @@
 # and since modified
 #
 
-from typing import Dict, List, Optional
+from typing import Any, List, Optional
+
+from alibi.api.interfaces import Explanation
 
 
 class ExplainerWrapper(object):
-    def validate(self, training_data_url: Optional[str]):
+    def validate(self, training_data_url: Optional[str]) -> Optional[Any]:
         pass
 
-    def explain(self, inputs: List) -> Dict:
+    def explain(self, inputs: List) -> Explanation:
         pass
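With the new signatures, `explain` is declared to return an alibi `Explanation` object rather than a bare `Dict`. A minimal sketch of a conforming subclass — the `EchoExplainer` and the stub `Explanation` class below are illustrative only; the real server imports `Explanation` from `alibi.api.interfaces`:

```python
from typing import Any, List, Optional

# Stand-in for alibi.api.interfaces.Explanation, so this sketch runs
# without alibi installed; the real class also carries meta and data.
class Explanation:
    def __init__(self, meta: dict, data: dict):
        self.meta, self.data = meta, data

class ExplainerWrapper:
    def validate(self, training_data_url: Optional[str]) -> Optional[Any]:
        pass

    def explain(self, inputs: List) -> Explanation:
        raise NotImplementedError

class EchoExplainer(ExplainerWrapper):
    """Toy wrapper that wraps its inputs in an Explanation."""
    def explain(self, inputs: List) -> Explanation:
        return Explanation(meta={"name": "echo"}, data={"inputs": inputs})
```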
@@ -20,7 +20,7 @@ def __init__(
 internal_batch_size: int = 100,
 method: str = "gausslegendre",
 layer: Optional[int] = None,
-**kwargs
+**kwargs,
 ):
 if keras_model is None:
     raise Exception("Integrated Gradients requires a Keras model")