[FEAT]: Model provider SubModel integration #4372

@OPAI-OP

What would you like to see?

Hi, I'm Mason Wong from SubModel.AI. We're an Asia-based GPU cloud platform with extensive experience in the field, and we've recently launched a new LLM inference service, InstaGen. We're now offering every user a free daily quota of either 1 million tokens or 200 requests on a selection of open-source models.

All of our LLM services are deployed in our self-managed, large-scale T3+ data centers with ISO 27001 certification. This ensures service stability and a high level of data security for our users. Our privacy policy is straightforward: we do not store any user token data, nor do we use it to train new models.

I've submitted a pull request to integrate SubModel.AI as a new provider for anything-llm, with the goal of offering users a high-quality, affordable service. I'm looking forward to your review and any feedback you might have.

Thank you, and have a nice day!

PR #4367
