
Is this truly open source? #1244

Closed
chubshaun opened this issue Mar 5, 2024 · 3 comments

Comments

@chubshaun

I'd like to get some clarification from the Bloop project on its definition of open source.

From the original Launch HN last year:
https://news.ycombinator.com/item?id=35236275

bloop is fully open-source. Semantic search, LLM prompts, regex search and code navigation are all contained in one repo

Contrast this with the current launch page:
https://bloop.ai

Mostly open source and free to use, with enterprise edition source available

I'm also having a difficult time understanding whether this program is truly "self-hostable" end-to-end. Meaning, if I choose to build from source, is there a way to swap out the OpenAI GPT endpoint using something like Ollama/LiteLLM or another OpenAI-compatible proxy?

Or does Bloop have a proprietary backend server that does a lot of heavy lifting before forwarding data on to the GPT-4 API? I don't want to invest a lot of effort/research into getting set up locally if I can't prevent my code base from being shipped off to Bloop's servers.
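For context on why such a swap is plausible: local runners like Ollama and proxies like LiteLLM expose OpenAI-compatible endpoints, so in principle only the base URL changes while the request shape stays the same. A minimal stdlib sketch of that idea (the localhost URL and model name are illustrative assumptions, not anything Bloop ships):

```python
import json
import urllib.request

OPENAI_BASE = "https://api.openai.com/v1"
LOCAL_BASE = "http://localhost:11434/v1"  # Ollama's OpenAI-compatible API (assumed default port)

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request against any base URL."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Same code path, different backend: only base_url and model change.
req = build_chat_request(LOCAL_BASE, "llama3", "explain this function")
```

If the endpoint and model name were configurable, pointing a client at a local model would be exactly this kind of one-line change.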

@keinsell

keinsell commented Mar 15, 2024

I think the term needs redefining. Nowadays it's common to see "open-source" projects where the source code is visible but not functional without setting up a ton of third-party services, while other projects are available end-to-end, so the customer can run the software on their own infrastructure.

With AI it's a tricky case: if Facebook's models hadn't leaked, running an LLM locally might never have become a thing. A lot of people code on mid-range MacBook Pros that can't handle e.g. llama-70b, or can only run it at a speed that's just not enough for a search engine. Am I supposed to wait 30 minutes for the AI to find something I could find myself in 5? Hell naw.

Relying on OpenAI was rational low-hanging fruit for solving the problem. I guess when the project started, nobody was thinking about llama and open models, or about the customers who would actually self-host, since that group is small compared to the average developer who can barely handle the OpenAI API.

@keinsell

The problem with things like this is hardcoding values inside the code, which eventually, at larger scale, makes them harder to replace. I'm sure you can swap out the hard-coded strings to use a local LLM and run Bloop on your own machine: https://github.com/search?q=repo%3ABloopAI%2Fbloop%20gpt-4&type=code (this and baseUrl)
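A common pattern to avoid this kind of hardcoding is reading the model name and base URL from the environment, so a local LLM can be dropped in without patching the source. A minimal sketch of the idea (`LLM_BASE_URL` and `LLM_MODEL` are hypothetical names I'm using for illustration, not Bloop's actual config):

```python
import os

def llm_config() -> dict:
    """Return LLM settings from env vars, falling back to the OpenAI defaults.

    Hypothetical variable names for illustration; Bloop's real config differs.
    """
    return {
        "base_url": os.environ.get("LLM_BASE_URL", "https://api.openai.com/v1"),
        "model": os.environ.get("LLM_MODEL", "gpt-4"),
    }
```

With this, running `LLM_BASE_URL=http://localhost:11434/v1 LLM_MODEL=llama3 <app>` would retarget the same code at a local model.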

@ggordonhall
Contributor

We've open-sourced the LLM backend: https://github.com/BloopAI/bloop/tree/oss/server/bleep/src/llm

You can now build and run bloop with your own OpenAI API key. Check out the instructions here: https://github.com/BloopAI/bloop?tab=readme-ov-file#building-from-source
