
[Howto]: Issue with SSL certificate in the company network #1150

@nisan221292

Description


Version

Visual Studio Code extension

Operating System

Windows 10

Your question

When I start the main file I get the following error message. I am running this inside the company network; the LLM provider itself is allowed through the restrictions, but do I need to tell the software about the Zscaler certificate somehow, and if yes, where?

(venv) PS C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main> python main.py
Traceback (most recent call last):
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\connectionpool.py", line 466, in _make_request
self._validate_conn(conn)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\connectionpool.py", line 1095, in _validate_conn
conn.connect()
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\connection.py", line 730, in connect
sock_and_verified = _ssl_wrap_socket_and_match_hostname(
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\connection.py", line 909, in ssl_wrap_socket_and_match_hostname
ssl_sock = ssl_wrap_socket(
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\util\ssl
.py", line 469, in ssl_wrap_socket
ssl_sock = ssl_wrap_socket_impl(sock, context, tls_in_tls, server_hostname)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\util\ssl
.py", line 513, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock, server_hostname=server_hostname)
File "C:\Users\kunalin\AppData\Local\Programs\Python\Python313\lib\ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "C:\Users\kunalin\AppData\Local\Programs\Python\Python313\lib\ssl.py", line 1040, in _create
self.do_handshake()
File "C:\Users\kunalin\AppData\Local\Programs\Python\Python313\lib\ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
ssl.SSLCertVerificationError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1122)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\connectionpool.py", line 789, in urlopen
response = self._make_request(
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\connectionpool.py", line 490, in _make_request
raise new_e
urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1122)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\requests\adapters.py", line 667, in send
resp = conn.urlopen(
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\connectionpool.py", line 843, in urlopen
retries = retries.increment(
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\urllib3\util\retry.py", line 519, in increment
raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1122)')))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\main.py", line 33, in
sys.exit(run_pythagora())
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\core\cli\main.py", line 237, in run_pythagora
success = run(async_main(ui, db, args))
File "C:\Users\kunalin\AppData\Local\Programs\Python\Python313\lib\asyncio\runners.py", line 44, in run
return loop.run_until_complete(main)
File "C:\Users\kunalin\AppData\Local\Programs\Python\Python313\lib\asyncio\base_events.py", line 642, in run_until_complete
return future.result()
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\core\cli\main.py", line 226, in async_main
success = await run_pythagora_session(sm, ui, args)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\core\cli\main.py", line 164, in run_pythagora_session
if not await llm_api_check(ui):
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\core\cli\main.py", line 110, in llm_api_check
results = await asyncio.gather(*tasks)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\core\cli\main.py", line 85, in check_llm
client_class = BaseLLMClient.for_provider(llm_config.provider)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\core\llm\base.py", line 334, in for_provider
from .azure_client import AzureClient
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\core\llm\azure_client.py", line 5, in
from core.llm.openai_client import OpenAIClient
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\core\llm\openai_client.py", line 15, in
tokenizer = tiktoken.get_encoding("cl100k_base")
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\tiktoken\registry.py", line 86, in get_encoding
enc = Encoding(**constructor())
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\tiktoken_ext\openai_public.py", line 76, in cl100k_base
mergeable_ranks = load_tiktoken_bpe(
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\tiktoken\load.py", line 144, in load_tiktoken_bpe
contents = read_file_cached(tiktoken_bpe_file, expected_hash)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\tiktoken\load.py", line 63, in read_file_cached
contents = read_file(blobpath)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\tiktoken\load.py", line 24, in read_file
resp = requests.get(blobpath)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\requests\api.py", line 73, in get
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\requests\api.py", line 59, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\requests\sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\requests\sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "C:\Users\kunalin\Downloads\gpt-pilot-main\gpt-pilot-main\gpt-pilot-main\venv\lib\site-packages\requests\adapters.py", line 698, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='openaipublic.blob.core.windows.net', port=443): Max retries exceeded with url: /encodings/cl100k_base.tiktoken (Caused by SSLError(SSLCertVerificationError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:1122)')))
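
In case it helps others behind Zscaler: the request that fails here is tiktoken downloading the cl100k_base encoding from openaipublic.blob.core.windows.net via requests, and verification fails because the Zscaler root CA is not in the certifi bundle Python uses. Exporting the Zscaler root certificate to a PEM file and pointing Python at a bundle that contains it usually resolves this. Below is a minimal sketch, not part of gpt-pilot; the paths and the helper name trust_zscaler_ca are assumptions for illustration.

```python
# Minimal sketch (assumptions: you have exported the Zscaler root CA to the
# PEM path below, and the combined bundle path is writable).
import os
import shutil

import certifi  # installed as a dependency of requests


ZSCALER_ROOT_PEM = r"C:\certs\zscaler-root-ca.pem"        # hypothetical path
COMBINED_BUNDLE = r"C:\certs\ca-bundle-with-zscaler.pem"  # hypothetical path


def trust_zscaler_ca() -> str:
    """Copy the certifi bundle, append the Zscaler root CA to the copy, and
    point requests and OpenSSL at it via environment variables."""
    shutil.copyfile(certifi.where(), COMBINED_BUNDLE)
    with open(ZSCALER_ROOT_PEM, "rb") as src, open(COMBINED_BUNDLE, "ab") as dst:
        dst.write(b"\n" + src.read())
    # requests reads REQUESTS_CA_BUNDLE per request, so the tiktoken download
    # of cl100k_base.tiktoken should verify against the combined bundle.
    os.environ["REQUESTS_CA_BUNDLE"] = COMBINED_BUNDLE
    # ssl.create_default_context() picks up SSL_CERT_FILE when it loads the
    # default verify paths, which covers stdlib-based HTTPS clients.
    os.environ["SSL_CERT_FILE"] = COMBINED_BUNDLE
    return COMBINED_BUNDLE


if __name__ == "__main__":
    print("Using CA bundle:", trust_zscaler_ca())
```

Environment variables set inside one Python process do not persist, so either call something like trust_zscaler_ca() at the very top of main.py (before the tiktoken download is triggered), or set REQUESTS_CA_BUNDLE and SSL_CERT_FILE to the combined bundle in PowerShell before running python main.py. If later calls to the LLM provider still fail, note that some clients (for example httpx, used by newer openai releases) load their CA bundle from certifi directly; appending the Zscaler root to the file returned by certifi.where() is another common workaround, though it is overwritten whenever certifi is upgraded.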
