Refactor independent cluster configs, pin pandas<2 in cluster environments #1120

Merged · 2 commits · Apr 17, 2023
23 changes: 0 additions & 23 deletions .github/cluster-upstream.yml

This file was deleted.

4 changes: 2 additions & 2 deletions .github/workflows/test-upstream.yml
@@ -129,9 +129,9 @@ jobs:
       - name: run a dask cluster
         run: |
           if [[ $which_upstream == "Dask" ]]; then
-           docker-compose -f .github/cluster-upstream.yml up -d
+           docker-compose -f continuous_integration/cluster/upstream.yml up -d
           else
-           docker-compose -f .github/cluster.yml up -d
+           docker-compose -f continuous_integration/cluster/stable.yml up -d
           fi

       # periodically ping logs until a connection has been established; assume failure after 2 minutes
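The "periodically ping logs" step itself is elided from this diff; as a generic illustration only (`wait_for_log` is a hypothetical helper, not the repo's actual CI code), such a poll loop with a 2-minute cutoff might look like:

```shell
#!/usr/bin/env bash
# Hedged sketch: poll a command's output until a pattern appears,
# giving up after a timeout (defaults mirror the CI comment's 2 minutes).
# `wait_for_log`, WAIT_TIMEOUT, and WAIT_INTERVAL are assumptions.
wait_for_log() {
  local pattern=$1; shift
  local deadline=$((SECONDS + ${WAIT_TIMEOUT:-120}))
  while (( SECONDS < deadline )); do
    # run the remaining arguments as a command and grep its output
    if "$@" 2>&1 | grep -q "$pattern"; then
      return 0
    fi
    sleep "${WAIT_INTERVAL:-5}"
  done
  return 1
}

# In CI this could poll the worker container, e.g.:
#   wait_for_log "connection" docker logs dask-worker
```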
4 changes: 2 additions & 2 deletions .github/workflows/test.yml
@@ -125,9 +125,9 @@ jobs:
         UPSTREAM: ${{ needs.detect-ci-trigger.outputs.triggered }}
         run: |
           if [[ $UPSTREAM == "true" ]]; then
-           docker-compose -f .github/cluster-upstream.yml up -d
+           docker-compose -f continuous_integration/cluster/upstream.yml up -d
           else
-           docker-compose -f .github/cluster.yml up -d
+           docker-compose -f continuous_integration/cluster/stable.yml up -d
           fi

       # periodically ping logs until a connection has been established; assume failure after 2 minutes
7 changes: 7 additions & 0 deletions continuous_integration/cluster/environment.yml
@@ -0,0 +1,7 @@
+name: base
+channels:
+  - conda-forge
+  - nodefaults
+dependencies:
+  # dask serialization needs core libraries to be consistent on client/cluster
+  - pandas>=1.4.0,<2
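As an illustration of the pin (not part of the PR), a plain-Python check mirroring `pandas>=1.4.0,<2` can be sketched like this; `in_pinned_range` is a hypothetical helper:

```python
# Hedged sketch: verify a version string falls inside the range pinned
# in environment.yml (>=1.4.0,<2). Not part of this PR.
def parse(version: str) -> tuple:
    """Turn '1.5.3' into (1, 5, 3); non-numeric suffixes are dropped."""
    parts = []
    for piece in version.split("."):
        digits = "".join(ch for ch in piece if ch.isdigit())
        parts.append(int(digits) if digits else 0)
    return tuple(parts)

def in_pinned_range(version: str, lower: str = "1.4.0", upper: str = "2") -> bool:
    """True when lower <= version < upper, mirroring 'pandas>=1.4.0,<2'."""
    return parse(lower) <= parse(version) < parse(upper)

print(in_pinned_range("1.5.3"))  # True: inside the pinned range
print(in_pinned_range("2.0.0"))  # False: excluded by the <2 pin
```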
continuous_integration/cluster/stable.yml
@@ -4,20 +4,19 @@ services:
dask-scheduler:
container_name: dask-scheduler
image: daskdev/dask:dev-py3.9
-    command: dask-scheduler
+    command: dask scheduler
ports:
- "8786:8786"
environment:
USE_MAMBA: "true"
# p2p shuffling requires pyarrow>=7.0.0
EXTRA_CONDA_PACKAGES: "pyarrow>=7.0.0"
volumes:
- ./environment.yml:/opt/app/environment.yml
dask-worker:
container_name: dask-worker
image: daskdev/dask:dev-py3.9
-    command: dask-worker dask-scheduler:8786
+    command: dask worker dask-scheduler:8786
environment:
USE_MAMBA: "true"
# TODO: remove pandas constraint once Dask images are updated
EXTRA_CONDA_PACKAGES: "cloudpickle>=2.1.0 pyarrow>=6.0.1 libstdcxx-ng>=12.1.0 pandas>=1.5.0"
volumes:
- ./environment.yml:/opt/app/environment.yml
- /tmp:/tmp
24 changes: 24 additions & 0 deletions continuous_integration/cluster/upstream.yml
@@ -0,0 +1,24 @@
+# Docker-compose setup used during tests
+version: '3'
+services:
+  dask-scheduler:
+    container_name: dask-scheduler
+    image: daskdev/dask:dev-py3.9
+    command: dask scheduler
+    ports:
+      - "8786:8786"
+    environment:
+      USE_MAMBA: "true"
+      EXTRA_CONDA_PACKAGES: "dask/label/dev::dask"
+    volumes:
+      - ./environment.yml:/opt/app/environment.yml
+  dask-worker:
+    container_name: dask-worker
+    image: daskdev/dask:dev-py3.9
+    command: dask worker dask-scheduler:8786
+    environment:
+      USE_MAMBA: "true"
+      EXTRA_CONDA_PACKAGES: "dask/label/dev::dask"
+    volumes:
+      - ./environment.yml:/opt/app/environment.yml
+      - /tmp:/tmp
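Both compose files expose the scheduler on port 8786, so a local test run could resolve the cluster address like this; the `DASK_SCHEDULER_ADDRESS` override and the helper name are assumptions for illustration, not part of this PR:

```python
import os

def scheduler_address(default_host: str = "localhost", default_port: int = 8786) -> str:
    """Hedged sketch: build the scheduler address the compose files expose
    on port 8786, with a hypothetical env-var override for local runs."""
    override = os.environ.get("DASK_SCHEDULER_ADDRESS")
    if override:
        return override
    return f"tcp://{default_host}:{default_port}"

# A test against the running compose cluster might then connect with:
#   from dask.distributed import Client
#   client = Client(scheduler_address())
print(scheduler_address())  # tcp://localhost:8786 when no override is set
```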