Passing SparseVector to model predictor not supported #1860

@kapilkd13

Description

ISSUE

I trained a Factorization Machines model on SageMaker with MXNet, following this notebook. My training data is sparse (300M rows, 4M columns), so I used smac.write_spmatrix_to_sparse_tensor to write it to S3 in RecordIO-protobuf format. I was able to train the model and deploy an endpoint for inference. The problem starts at inference time: my feature dimension is 4 million, so when I pass a dense vector of that size to predict, I get a "Request Entity Too Large" error. On checking, I found that SageMaker has a 5 MB request size limit. The only option I can think of is to somehow pass a sparse vector instead, but predict does not accept one. Can you help?
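
For reference, this is roughly how I write the sparse training data to S3 (a minimal sketch: the matrix below is a tiny stand-in for my real data, and the bucket/key are placeholders):

```python
import io

import boto3
import numpy as np
import scipy.sparse
import sagemaker.amazon.common as smac

# Tiny stand-in for one chunk of the real training data
# (the real matrix is ~300M rows x 4M columns, written out in chunks).
X_chunk = scipy.sparse.random(1000, 4_000_000, density=1e-5,
                              format="csr", dtype="float32")
y_chunk = np.random.randint(0, 2, size=1000).astype("float32")

# Serialize the sparse chunk to RecordIO-protobuf.
buf = io.BytesIO()
smac.write_spmatrix_to_sparse_tensor(buf, X_chunk, y_chunk)
buf.seek(0)

# Placeholder bucket/key -- the real values are account-specific.
boto3.resource("s3").Bucket("my-bucket").Object(
    "fm-train/part-0.protobuf"
).upload_fileobj(buf)
```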
I tried sending a sparse tensor with the content type set to protobuf, but that didn't work.
I am using the RealTimePredictor class as predictor_cls. Is there anything I can do with this class so that it accepts a sparse vector and, if needed, converts it on the server side? Any suggestions?
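
Roughly, the failing inference attempt looks like this (a sketch with a placeholder endpoint name; the custom serializer just reuses write_spmatrix_to_sparse_tensor to produce the protobuf payload):

```python
import io

import scipy.sparse
import sagemaker.amazon.common as smac
from sagemaker.predictor import RealTimePredictor


def sparse_protobuf_serializer(sparse_rows):
    # Serialize a scipy.sparse row batch to RecordIO-protobuf so the
    # request stays small instead of a 4M-wide dense vector.
    buf = io.BytesIO()
    smac.write_spmatrix_to_sparse_tensor(buf, sparse_rows)
    return buf.getvalue()


predictor = RealTimePredictor(
    "my-fm-endpoint",  # placeholder endpoint name
    serializer=sparse_protobuf_serializer,
    content_type="application/x-recordio-protobuf",
)

# A single sparse row with the full 4M feature dimension.
one_row = scipy.sparse.random(1, 4_000_000, density=1e-5,
                              format="csr", dtype="float32")
result = predictor.predict(one_row)  # this is the request that gets rejected
```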
