
Remove core #229

Closed · wants to merge 23 commits
28 changes: 1 addition & 27 deletions .github/scripts/check_diff.py
@@ -9,8 +9,6 @@


 LANGCHAIN_DIRS = [
-    "libs/core",
-    "libs/text-splitters",
     "libs/langchain",
     "libs/community",
     "libs/experimental",
@@ -99,12 +97,7 @@ def add_dependents(dirs_to_eval: Set[str], dependents: dict) -> List[str]:


 def _get_configs_for_single_dir(job: str, dir_: str) -> List[Dict[str, str]]:
-    if dir_ == "libs/core":
-        return [
-            {"working-directory": dir_, "python-version": f"3.{v}"}
-            for v in range(8, 13)
-        ]
-    min_python = "3.8"
+    min_python = "3.9"
     max_python = "3.12"

     # custom logic for specific directories
@@ -188,30 +181,11 @@ def _get_configs_for_multi_dirs(
                 found = True
         if found:
             dirs_to_run["extended-test"].add(dir_)
-        elif file.startswith("libs/standard-tests"):
-            # TODO: update to include all packages that rely on standard-tests (all partner packages)
-            # note: won't run on external repo partners
-            dirs_to_run["lint"].add("libs/standard-tests")
-            dirs_to_run["test"].add("libs/partners/mistralai")
-            dirs_to_run["test"].add("libs/partners/openai")
-            dirs_to_run["test"].add("libs/partners/anthropic")
-            dirs_to_run["test"].add("libs/partners/fireworks")
-            dirs_to_run["test"].add("libs/partners/groq")
-
         elif file.startswith("libs/cli"):
             # todo: add cli makefile
             pass
         elif file.startswith("libs/streamlit_agent"):
             pass
-        elif file.startswith("libs/partners"):
-            partner_dir = file.split("/")[2]
-            if os.path.isdir(f"libs/partners/{partner_dir}") and [
-                filename
-                for filename in os.listdir(f"libs/partners/{partner_dir}")
-                if not filename.startswith(".")
-            ] != ["README.md"]:
-                dirs_to_run["test"].add(f"libs/partners/{partner_dir}")
-            # Skip if the directory was deleted or is just a tombstone readme
         elif file.startswith("libs/"):
             raise ValueError(
                 f"Unknown lib: {file}. check_diff.py likely needs "
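With the `libs/core` special case removed, `_get_configs_for_single_dir` builds the same default version matrix for every directory. A minimal runnable sketch of the resulting logic (simplified; the real function also applies per-directory overrides further down):

```python
from typing import Dict, List


def get_configs_for_single_dir(job: str, dir_: str) -> List[Dict[str, str]]:
    # No libs/core special case any more: every directory gets the
    # default range, which this PR bumps from 3.8 to 3.9 at the low end.
    min_python = "3.9"
    max_python = "3.12"
    lo = int(min_python.split(".")[1])
    hi = int(max_python.split(".")[1])
    return [
        {"working-directory": dir_, "python-version": f"3.{v}"}
        for v in range(lo, hi + 1)
    ]


configs = get_configs_for_single_dir("test", "libs/langchain")
print(len(configs))                  # 4
print(configs[0]["python-version"])  # 3.9
```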
1 change: 0 additions & 1 deletion .github/workflows/_pydantic_compatibility.yml
@@ -24,7 +24,6 @@ jobs:
     strategy:
       matrix:
         python-version:
-          - "3.8"
           - "3.9"
           - "3.10"
           - "3.11"
3 changes: 1 addition & 2 deletions .github/workflows/_pypi_upload.yml
@@ -8,7 +8,7 @@ on:
         required: true
         type: choice
         options:
-          - gigachain-core
+          - langchain-core
           - gigachain-community
       publish:
         description: 'Publish distrib to pypi'
@@ -22,7 +22,6 @@ on:
         default: false

 env:
-  build-dir-gigachain-core: "libs/core"
   build-dir-gigachain-community: "libs/community"

 jobs:
2 changes: 1 addition & 1 deletion .github/workflows/_test_doc_imports.yml
@@ -31,7 +31,7 @@ jobs:

       - name: Install langchain editable
         run: |
-          poetry run pip install -e libs/core libs/langchain libs/community libs/experimental
+          poetry run pip install -e libs/langchain libs/community libs/experimental

       - name: Check doc imports
         shell: bash
2 changes: 1 addition & 1 deletion .github/workflows/scheduled_test.yml
@@ -18,7 +18,7 @@ jobs:
       fail-fast: false
       matrix:
         python-version:
-          - "3.8"
+          - "3.9"
           - "3.11"
         working-directory:
           - "libs/partners/openai"
12 changes: 6 additions & 6 deletions Makefile
@@ -67,15 +67,15 @@ spell_fix:

 ## lint: Run linting on the project.
 lint lint_package lint_tests:
-	poetry run ruff check docs templates cookbook
-	poetry run ruff format docs templates cookbook --diff
-	poetry run ruff check --select I docs templates cookbook
-	git grep 'from langchain import' docs/docs templates cookbook | grep -vE 'from langchain import (hub)' && exit 1 || exit 0
+	poetry run ruff check docs cookbook
+	poetry run ruff format docs cookbook --diff
+	poetry run ruff check --select I docs cookbook
+	git grep 'from langchain import' docs/docs cookbook | grep -vE 'from langchain import (hub)' && exit 1 || exit 0

 ## format: Format the project files.
 format format_diff:
-	poetry run ruff format docs templates cookbook
-	poetry run ruff check --select I --fix docs templates cookbook
+	poetry run ruff format docs cookbook
+	poetry run ruff check --select I --fix docs cookbook

 ## resolve_lock_conflicts: Resolve all git conflicts in all poetry.lock files in all directories and add this files to the git index
 resolve_lock_conflicts:
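The `git grep … && exit 1 || exit 0` recipe inverts grep's exit status: the lint target fails only when a bare `from langchain import` other than `from langchain import hub` appears in the docs. A hedged Python equivalent of that gate (illustrative only, not part of the Makefile):

```python
import re

# Mirrors the grep -vE 'from langchain import (hub)' filter: `hub` is
# the one allowed bare import from the `langchain` meta-package.
ALLOWED = re.compile(r"from langchain import (hub)")


def offending_lines(text: str) -> list:
    # Keep lines that use the bare import but are not on the allow-list.
    return [
        line
        for line in text.splitlines()
        if "from langchain import" in line and not ALLOWED.search(line)
    ]


sample = "from langchain import hub\nfrom langchain import chains\n"
print(offending_lines(sample))  # ['from langchain import chains']
```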
2 changes: 1 addition & 1 deletion README.md
@@ -52,7 +52,7 @@ pip install gigachain-community

 ```py
 from langchain_core.messages import HumanMessage, SystemMessage
-from langchain_community.chat_models.gigachat import GigaChat
+from langchain_gigachat.chat_models.gigachat import GigaChat

 # Авторизация в GigaChat
 llm = GigaChat(
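The same `langchain_community.*.gigachat` → `langchain_gigachat.*` import swap repeats through the README and the notebooks below. As an illustration only (these rewrite rules are an assumption, not the maintainers' actual migration tooling), the substitution can be sketched as:

```python
import re

# Hypothetical rewrite rules matching the import-path changes in this PR.
RULES = [
    (re.compile(r"langchain_community\.chat_models\.gigachat"),
     "langchain_gigachat.chat_models.gigachat"),
    (re.compile(r"langchain_community\.llms\.gigachat"),
     "langchain_gigachat.llms.gigachat"),
    (re.compile(r"langchain_community\.embeddings\.gigachat"),
     "langchain_gigachat.embeddings.gigachat"),
]


def migrate(source: str) -> str:
    # Apply every rule to the given source text.
    for pattern, replacement in RULES:
        source = pattern.sub(replacement, source)
    return source


old = "from langchain_community.chat_models.gigachat import GigaChat"
print(migrate(old))  # from langchain_gigachat.chat_models.gigachat import GigaChat
```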
2 changes: 1 addition & 1 deletion cookbook/gigachat_stop_sequence.ipynb
@@ -22,7 +22,7 @@
 },
 "outputs": [],
 "source": [
-    "from langchain_community.chat_models.gigachat import GigaChat\n",
+    "from langchain_gigachat.chat_models.gigachat import GigaChat\n",
     "\n",
     "llm = GigaChat(\n",
     "    model=\"GigaChat-Pro\",\n",
2 changes: 1 addition & 1 deletion cookbook/gigachat_vision/gigachat_vision.ipynb
@@ -27,7 +27,7 @@
 },
 "outputs": [],
 "source": [
-    "from langchain_community.chat_models.gigachat import GigaChat\n",
+    "from langchain_gigachat.chat_models.gigachat import GigaChat\n",
     "\n",
     "llm = GigaChat(\n",
     "    verify_ssl_certs=False,\n",
4 changes: 2 additions & 2 deletions cookbook/smart_llm.ipynb
@@ -40,8 +40,8 @@
 "outputs": [],
 "source": [
     "from langchain.prompts import PromptTemplate\n",
-    "from langchain_community.chat_models.gigachat import GigaChat\n",
-    "from langchain_experimental.smart_llm import SmartLLMChain"
+    "from langchain_experimental.smart_llm import SmartLLMChain\n",
+    "from langchain_gigachat.chat_models.gigachat import GigaChat"
 ]
},
{
2 changes: 1 addition & 1 deletion cookbook_ru/yandex_search/retriever.ipynb
@@ -97,7 +97,7 @@
     "from textwrap import dedent\n",
     "\n",
     "from IPython.display import Markdown\n",
-    "from langchain_community.chat_models.gigachat import GigaChat\n",
+    "from langchain_gigachat.chat_models.gigachat import GigaChat\n",
     "from langchain_community.retrievers.yandex_search import YandexSearchAPIRetriever\n",
     "from langchain_community.utilities.yandex_search import YandexSearchAPIWrapper\n",
     "from langchain_core.output_parsers import StrOutputParser\n",
2 changes: 1 addition & 1 deletion cookbook_ru/yandex_search/tool.ipynb
@@ -99,7 +99,7 @@
     "\n",
     "from IPython.display import Markdown\n",
     "from langchain.agents import AgentExecutor, create_tool_calling_agent\n",
-    "from langchain_community.chat_models.gigachat import GigaChat\n",
+    "from langchain_gigachat.chat_models.gigachat import GigaChat\n",
     "from langchain_community.tools.yandex_search import YandexSearchResults\n",
     "from langchain_community.utilities.yandex_search import YandexSearchAPIWrapper\n",
     "from langchain_core.chat_history import InMemoryChatMessageHistory\n",
3 changes: 1 addition & 2 deletions docs/api_reference/create_api_rst.py
@@ -492,7 +492,6 @@ def _package_dir(package_name: str = "langchain") -> Path:
         "langchain",
         "experimental",
         "community",
-        "core",
         "cli",
         "text-splitters",
     ):
@@ -534,7 +533,7 @@ def _build_index(dirs: List[str]) -> None:
         "aws": "AWS",
         "ai21": "AI21",
     }
-    ordered = ["core", "langchain", "text-splitters", "community", "experimental"]
+    ordered = ["langchain", "text-splitters", "community", "experimental"]
     main_ = [dir_ for dir_ in ordered if dir_ in dirs]
     integrations = sorted(dir_ for dir_ in dirs if dir_ not in main_)
     doc = """# LangChain Python API Reference
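With `core` dropped from `ordered`, `_build_index` still lists the curated packages first in a fixed order and alphabetizes everything else as integrations. A self-contained sketch of that ordering step:

```python
# Ordering logic as it stands after this PR: curated packages keep
# their fixed order; every other directory is treated as an
# integration package and sorted alphabetically.
ordered = ["langchain", "text-splitters", "community", "experimental"]
dirs = ["community", "openai", "anthropic", "langchain", "aws"]

main_ = [dir_ for dir_ in ordered if dir_ in dirs]
integrations = sorted(dir_ for dir_ in dirs if dir_ not in main_)

print(main_)         # ['langchain', 'community']
print(integrations)  # ['anthropic', 'aws', 'openai']
```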
8 changes: 0 additions & 8 deletions docs/docs/contributing/code/setup.mdx
@@ -77,14 +77,6 @@ There are also [integration tests and code-coverage](/docs/contributing/testing/

 If you are only developing `langchain_core` or `langchain_experimental`, you can simply install the dependencies for the respective projects and run tests:

-```bash
-cd libs/core
-poetry install --with test
-make test
-```
-
-Or:
-
 ```bash
 cd libs/experimental
 poetry install --with test
2 changes: 1 addition & 1 deletion docs/docs/how_to/embed_text.mdx
@@ -23,7 +23,7 @@ import TabItem from '@theme/TabItem';
 To start we'll need to install the OpenAI partner package:

 ```bash
-pip install gigachain-openai
+pip install langchain-openai
 ```

 Accessing the API requires an API key, which you can get by creating an account and heading [here](https://platform.openai.com/account/api-keys). Once we have a key we'll want to set it as an environment variable by running:
2 changes: 1 addition & 1 deletion docs/docs/integrations/graphs/memgraph.ipynb
@@ -53,7 +53,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "pip install gigachain_community gigachain-openai neo4j gqlalchemy --user"
+    "pip install gigachain_community langchain-openai neo4j gqlalchemy --user"
 ]
},
{
2 changes: 1 addition & 1 deletion docs/docs/integrations/graphs/ontotext.ipynb
@@ -92,7 +92,7 @@
     "pip install jupyter==1.0.0\n",
     "pip install openai==1.6.1\n",
     "pip install rdflib==7.0.0\n",
-    "pip install gigachain-openai==0.0.2\n",
+    "pip install langchain-openai==0.0.2\n",
     "pip install langchain>=0.1.5\n",
     "```\n",
     "\n",
2 changes: 1 addition & 1 deletion docs/docs/integrations/platforms/microsoft.mdx
@@ -15,7 +15,7 @@ All functionality related to `Microsoft Azure` and other `Microsoft` products.
 >[Azure OpenAI](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/) is an `Azure` service with powerful language models from `OpenAI` including the `GPT-3`, `Codex` and `Embeddings model` series for content generation, summarization, semantic search, and natural language to code translation.

 ```bash
-pip install gigachain-openai
+pip install langchain-openai
 ```

 Set the environment variables to get access to the `Azure OpenAI` service.
2 changes: 1 addition & 1 deletion docs/docs/integrations/platforms/openai.mdx
@@ -20,7 +20,7 @@ All functionality related to OpenAI

 Install the integration package with
 ```bash
-pip install gigachain-openai
+pip install langchain-openai
 ```

 Get an OpenAI api key and set it as an environment variable (`OPENAI_API_KEY`)
2 changes: 1 addition & 1 deletion docs/docs/integrations/retrievers/elastic_qna.ipynb
@@ -165,7 +165,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "from langchain_community.chat_models.gigachat import GigaChat\n",
+    "from langchain_gigachat.chat_models.gigachat import GigaChat\n",
     "\n",
     "user = getpass.getpass(\"Giga user:\")\n",
     "password = getpass.getpass(\"Giga password:\")"
@@ -44,7 +44,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "%pip install --upgrade --quiet gigachain-core databricks-vectorsearch langchain-openai tiktoken"
+    "%pip install --upgrade --quiet langchain-core databricks-vectorsearch langchain-openai tiktoken"
 ]
},
{
4 changes: 2 additions & 2 deletions docs/docs/introduction.mdx
@@ -27,9 +27,9 @@ import useBaseUrl from '@docusaurus/useBaseUrl';

 Concretely, the framework consists of the following open-source libraries:

-- **`gigachain-core`**: Base abstractions and LangChain Expression Language.
+- **`langchain-core`**: Base abstractions and LangChain Expression Language.
 - **`gigachain-community`**: Third party integrations.
-- Partner packages (e.g. **`gigachain-openai`**, etc.): Some integrations have been further split into their own lightweight packages that only depend on **`langchain-core`**.
+- Partner packages (e.g. **`langchain-openai`**, etc.): Some integrations have been further split into their own lightweight packages that only depend on **`langchain-core`**.
 - **`gigachain`**: Chains, agents, and retrieval strategies that make up an application's cognitive architecture.
 - **[langgraph](/docs/langgraph)**: Build robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph.
 - **[langserve](/docs/langserve)**: Deploy LangChain chains as REST APIs.
2 changes: 1 addition & 1 deletion docs/docs/tutorials/classification.ipynb
@@ -49,7 +49,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "%pip install --upgrade --quiet gigachain gigachain-openai\n",
+    "%pip install --upgrade --quiet gigachain langchain-openai\n",
     "\n",
     "# Set env var OPENAI_API_KEY or load from a .env file:\n",
     "# import dotenv\n",
2 changes: 1 addition & 1 deletion docs/docs/versions/overview.mdx
@@ -79,7 +79,7 @@ To understand why we think breaking the dependency of `langchain` on `langchain-

 ```toml
 python = ">=3.8.1,<4.0"
-langchain-core = "^0.2.0"
+langchain-core = "^0.2.41"
 langchain-text-splitters = ">=0.0.1,<0.1"
 langsmith = "^0.1.17"
 pydantic = ">=1,<3"
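The pin moves from `^0.2.0` to `^0.2.41`. Under Poetry's caret semantics for `0.x` versions, `^0.2.41` is equivalent to `>=0.2.41,<0.3.0`. A small hand-rolled sketch (covers three-part `0.x` pins only, not full caret semantics):

```python
def satisfies_caret(version: str, pin: str) -> bool:
    """Check a version against a Poetry caret pin like '^0.2.41'.

    For a 0.Y.Z pin the caret allows >=0.Y.Z,<0.(Y+1).0; this sketch
    handles only that three-part 0.x case.
    """
    lo = tuple(int(part) for part in pin.lstrip("^").split("."))
    hi = (lo[0], lo[1] + 1, 0)  # next breaking release for a 0.Y.Z pin
    v = tuple(int(part) for part in version.split("."))
    return lo <= v < hi


print(satisfies_caret("0.2.41", "^0.2.41"))  # True
print(satisfies_caret("0.2.0", "^0.2.41"))   # False: the old pin is now excluded
print(satisfies_caret("0.3.0", "^0.2.41"))   # False
```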
2 changes: 1 addition & 1 deletion docs/docs_ru/cookbook/chains/retrieve.ipynb
@@ -510,7 +510,7 @@
 "outputs": [],
 "source": [
     "from langchain.chains import RetrievalQA\n",
-    "from langchain_community.llms.gigachat import GigaChat"
+    "from langchain_gigachat.llms.gigachat import GigaChat"
 ]
},
{
2 changes: 1 addition & 1 deletion docs/docs_ru/cookbook/code_writing.ipynb
@@ -27,7 +27,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "%pip install --upgrade --quiet gigachain-core gigachain-experimental langchain-openai"
+    "%pip install --upgrade --quiet langchain-core gigachain-experimental langchain-openai"
 ]
},
{
2 changes: 1 addition & 1 deletion docs/docs_ru/cookbook/extraction.ipynb
@@ -145,7 +145,7 @@
 }
 ],
 "source": [
-    "from langchain_community.chat_models.gigachat import GigaChat\n",
+    "from langchain_gigachat.chat_models.gigachat import GigaChat\n",
     "\n",
     "llm = GigaChat(\n",
     "    timeout=6000,\n",
4 changes: 2 additions & 2 deletions docs/docs_ru/cookbook/gigachat_functions_agent.ipynb
@@ -44,7 +44,7 @@
 "metadata": {},
 "outputs": [],
 "source": [
-    "from langchain_community.chat_models.gigachat import GigaChat\n",
+    "from langchain_gigachat.chat_models.gigachat import GigaChat\n",
     "\n",
     "giga = GigaChat(\n",
     "    credentials=\"ВАШИ_АВТОРИЗАЦИОННЫЕ_ДАННЫЕ\", model=\"GigaChat-Pro\", timeout=30\n",
@@ -715,7 +715,7 @@
 "source": [
     "import json\n",
     "\n",
-    "from langchain_core.utils.function_calling import convert_to_gigachat_function\n",
+    "from langchain_gigachat.tools.gigachat_tools import convert_to_gigachat_function\n",
     "\n",
     "print(\n",
     "    json.dumps(\n",
2 changes: 1 addition & 1 deletion docs/docs_ru/cookbook/gigachat_qa.ipynb
@@ -164,8 +164,8 @@
 "outputs": [],
 "source": [
     "from chromadb.config import Settings\n",
-    "from langchain_community.embeddings.gigachat import GigaChatEmbeddings\n",
     "from langchain_community.vectorstores import Chroma\n",
+    "from langchain_gigachat.embeddings.gigachat import GigaChatEmbeddings\n",
     "\n",
     "embeddings = GigaChatEmbeddings(\n",
     "    credentials=\"... авторизационные данные ...\", verify_ssl_certs=False\n",
10 changes: 5 additions & 5 deletions docs/docs_ru/ru/gigachain/concepts.mdx
@@ -9,7 +9,7 @@ import useBaseUrl from '@docusaurus/useBaseUrl';

 Фреймворк GigaChain состоит из нескольких пакетов.

-### `gigachain-core`
+### `langchain-core`

 Пакет содержит базовые абстракции различных компонентов и способы их объединения.
 Здесь определены интерфейсы для основных компонентов:LLM, векторных хранилищ, ретриверов и других.
@@ -18,7 +18,7 @@

 ### Пакеты популярных сервисов

-Тогда как полный список интеграций содержится в `gigachain-community`, интеграции с популярными сервисами выделены в собственные пакеты (например, `gigachain-openai`, `gigachain-anthropic` и т.д.).
+Тогда как полный список интеграций содержится в `gigachain-community`, интеграции с популярными сервисами выделены в собственные пакеты (например, `langchain-openai`, `gigachain-anthropic` и т.д.).

 ### `gigachain`

@@ -117,7 +117,7 @@ GigaGraph предоставляет высокоуровневые интерф
 - `ainvoke`: вызов цепочки с входными данными асинхронно
 - `abatch`: вызов цепочки со списком входных данных асинхронно
 - `astream_log`: потоковая передача промежуточных шагов по мере их выполнения, в дополнение к окончательному ответу
-- `astream_events`: **beta** потоковая передача событий по мере их выполнения в цепочке (введено в `gigachain-core` версии 0.1.14)
+- `astream_events`: **beta** потоковая передача событий по мере их выполнения в цепочке (введено в `langchain-core` версии 0.1.14)

 Типы входных и выходных данных варьируются в зависимости от компонента:

@@ -493,7 +493,7 @@ GigaChain предоставляет систему колбэков, котор

 #### Обработчики колбэков

-`CallbackHandlers` — это объекты, которые реализуют интерфейс [`CallbackHandler`](https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.base.BaseCallbackHandler.html#gigachain-core-callbacks-base-basecallbackhandler), включающий метод для каждого события, на которое можно подписаться.
+`CallbackHandlers` — это объекты, которые реализуют интерфейс [`CallbackHandler`](https://api.python.langchain.com/en/latest/callbacks/langchain_core.callbacks.base.BaseCallbackHandler.html#langchain-core-callbacks-base-basecallbackhandler), включающий метод для каждого события, на которое можно подписаться.
 При получении события объект `CallbackManager` вызывает соответствующий метод каждого обработчика.

 ```python
@@ -626,7 +626,7 @@ GigaChain предоставляет стандартизированный ин
 ### Разделение текста

 GigaChain предлагает множество различных типов разделителей текста.
-Все они содержатся в пакете `gigachain-text-splitters`.
+Все они содержатся в пакете `langchain-text-splitters`.

 Колонки таблицы: