Merge branch 'main' into feature/allow-delta-drop-columns
Jeremynadal33 authored Aug 9, 2024
2 parents f3f7479 + 2124423 commit 6ba88de
Showing 6 changed files with 14 additions and 5 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/main.yml
@@ -75,7 +75,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11"]
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]

steps:
- name: Check out the repository
@@ -173,7 +173,7 @@ jobs:
fail-fast: false
matrix:
os: [ubuntu-latest, macos-12, windows-latest]
python-version: ["3.8", "3.9", "3.10", "3.11"]
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]

steps:
- name: Set up Python ${{ matrix.python-version }}
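
With Python 3.12 added to the test matrix above, it can help to reproduce the same check locally before pushing. This is only a sketch: it assumes the repository's tox setup exposes a per-version `py312` environment (a common dbt-adapter convention, not shown in this hunk).

```sh
# Hypothetical local run; requires Python 3.12 plus a py312 env in tox.ini
python3.12 -m pip install tox
tox -e py312
```
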
2 changes: 1 addition & 1 deletion .github/workflows/release-prep.yml
@@ -448,7 +448,7 @@ jobs:
strategy:
fail-fast: false
matrix:
python-version: ["3.8", "3.9", "3.10", "3.11"]
python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]

steps:
- name: Check out the repository
1 change: 1 addition & 0 deletions .pre-commit-config.yaml
@@ -28,6 +28,7 @@ repos:
- --target-version=py39
- --target-version=py310
- --target-version=py311
+- --target-version=py312
additional_dependencies: [flaky]

- repo: https://github.com/pycqa/flake8
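
To exercise the new py312 target locally, the updated hook can be run across the whole tree. A minimal sketch, assuming the hook id is `black` (the id published by upstream psf/black, not shown in this hunk):

```sh
# Run only the black hook against every file, picking up the new target version
pre-commit run black --all-files
```
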
7 changes: 7 additions & 0 deletions README.md
@@ -65,6 +65,13 @@ rm -rf ./.hive-metastore/
rm -rf ./.spark-warehouse/
```

+#### Additional Configuration for MacOS
+
+If installing on MacOS, use `homebrew` to install required dependencies.
+```sh
+brew install unixodbc
+```

### Reporting bugs and contributing code

- Want to report a bug or request a feature? Let us know on [Slack](http://slack.getdbt.com/), or open [an issue](https://github.com/fishtown-analytics/dbt-spark/issues/new).
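
As a follow-up to the Homebrew step added above, a quick sanity check can confirm the driver manager is visible before installing the adapter. This is a sketch, assuming unixODBC puts `odbcinst` on `PATH` and that the pyodbc-based extra is still exposed as `ODBC` in setup.py:

```sh
# Show where unixODBC looks for odbc.ini / odbcinst.ini
odbcinst -j

# Install dbt-spark with the ODBC extra (extra name assumed from setup.py)
pip install "dbt-spark[ODBC]"
```
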
2 changes: 1 addition & 1 deletion requirements.txt
@@ -1,7 +1,7 @@
pyhive[hive_pure_sasl]~=0.7.0
requests>=2.28.1

-pyodbc~=4.0.39 --no-binary pyodbc
+pyodbc~=5.1.0 --no-binary pyodbc
sqlparams>=3.0.0
thrift>=0.13.0
pyspark>=3.0.0,<4.0.0
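
The `--no-binary pyodbc` option on the updated requirements line forces pip to build pyodbc 5.1.x from source against the locally installed unixODBC instead of using a prebuilt wheel. A rough hand-installed equivalent, assuming the unixODBC headers are already available, would be:

```sh
# Build pyodbc from source rather than downloading a wheel
pip install --no-binary pyodbc "pyodbc~=5.1.0"
```
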
3 changes: 2 additions & 1 deletion setup.py
@@ -44,7 +44,7 @@ def _get_plugin_version_dict():
package_version = "1.9.0a1"
description = """The Apache Spark adapter plugin for dbt"""

odbc_extras = ["pyodbc~=4.0.39"]
odbc_extras = ["pyodbc~=5.1.0"]
pyhive_extras = [
"PyHive[hive_pure_sasl]~=0.7.0",
"thrift>=0.11.0,<0.17.0",
@@ -87,6 +87,7 @@ def _get_plugin_version_dict():
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
"Programming Language :: Python :: 3.11",
"Programming Language :: Python :: 3.12",
],
python_requires=">=3.8",
)
