Sync 1.25.0 #12

Merged
merged 90 commits on Nov 26, 2024
Commits (90)
e9ef43d
Prepare next development version 1.24.0-SNAPSHOT
mobuchowski Oct 4, 2024
1bdefc8
Bump the client-java group in /client/java with 8 updates (#3137)
dependabot[bot] Oct 7, 2024
8ae35c8
Require Java 17 only for Spark 4.0 (#3121)
jphalip Oct 7, 2024
31e8246
Bump the integration-spark group in /integration/spark with 7 updates…
dependabot[bot] Oct 7, 2024
1286b8c
avoid tests in configurable test (#3141)
pawel-big-lebowski Oct 8, 2024
1482805
Add extra CLL tests (#3085)
jphalip Oct 9, 2024
2cbef74
Update CHANGELOG.md link to PR (#3145)
mobuchowski Oct 9, 2024
03af2c6
spark: Limit the Seq size in RddPathUtils::extract() to avoid OutOfMe…
codelixir Oct 11, 2024
bf3d4c5
test: Add integration tests for EMR (#3142)
arturowczarek Oct 11, 2024
0b9e5ed
Bump the integration-spark group in /integration/spark with 2 updates…
dependabot[bot] Oct 14, 2024
4ba3dd2
[docs] fix argos config to run on the main repo (#3154)
pawel-big-lebowski Oct 15, 2024
365427e
fix: Fix a bunch of warnings (#3158)
arturowczarek Oct 16, 2024
2e68de3
fix: Fix Kotlin capitalized function deprecation warning (#3162)
arturowczarek Oct 16, 2024
0cc6d00
use uv to manage maturin in python sqlparser interface (#3161)
mobuchowski Oct 16, 2024
040084c
fix databricks naming mismatch in documentation (#3163)
pawel-big-lebowski Oct 17, 2024
3068639
Rename Dataplex transport to GcpLineage (#3156)
ddebowczyk92 Oct 17, 2024
5b971e6
Bump cookie and express in /website (#3155)
dependabot[bot] Oct 17, 2024
386e32a
changing from 2nd wed to 3rd (#3166)
Sheeri Oct 17, 2024
65931fb
Fix/emr implicit glue catalog (#3147)
arturowczarek Oct 17, 2024
bb9b902
Bump the client-java group across 1 directory with 7 updates (#3174)
dependabot[bot] Oct 21, 2024
96c8ff6
Fix subquery column lineage. (#3170)
JDarDagran Oct 21, 2024
c1bd0ca
feature: Improve Databricks integration tests (#3176)
arturowczarek Oct 22, 2024
75230b4
Add missing javadocs for spark-extension-interfaces (causing warnings…
arturowczarek Oct 23, 2024
e491670
Fix Databricks integration running (#3178)
arturowczarek Oct 23, 2024
a967c48
refactor: Few changes in OpenLineageSparkListener (#3179)
arturowczarek Oct 23, 2024
7f0fc07
ci: use GH label to filter out jobs (#3045)
mobuchowski Oct 24, 2024
76a8948
local file system should not point to IP (#3159)
mobuchowski Oct 25, 2024
b6b4cba
Bump the client-java group in /client/java with 4 updates (#3188)
dependabot[bot] Oct 28, 2024
cacf8ab
Bump the integration-spark group across 1 directory with 8 updates (#…
dependabot[bot] Oct 28, 2024
f651a32
fix: Fix missing WEB_PORT variable where marquez-web container is use…
arturowczarek Oct 28, 2024
64b76e6
fix: databricks integration tests on CircleCI regression is now fixed…
arturowczarek Oct 28, 2024
ddb24dc
fix: fixes regression with bad Spark version in Databricks integratio…
arturowczarek Oct 29, 2024
efcbf1d
fix: Fixes names normalization. (#3053)
arturowczarek Oct 30, 2024
7717b1d
feature: improve Databricks integration tests (#3195)
arturowczarek Oct 30, 2024
50ba05b
removal: Deprecated column level visitors (#3198)
arturowczarek Oct 30, 2024
b2ccfae
Move Spark release to the machine type common to other Java release j…
mobuchowski Oct 30, 2024
0e98c8c
Add JobTypeJobFacets to dbt integration's model, test, snapshot event…
mobuchowski Oct 30, 2024
c7a88ed
test: Exit early when the Databricks integration test has unexpected …
arturowczarek Oct 31, 2024
737229a
[Fix][Integration/dbt] Parse dbt source tests (#3208)
MassyB Nov 1, 2024
f851d77
Fix edit this page link. (#3211)
JDarDagran Nov 3, 2024
38c9e7b
build(deps): bump the client-java group in /client/java with 5 update…
dependabot[bot] Nov 3, 2024
7fd8762
build(deps): bump the integration-spark group (#3212)
dependabot[bot] Nov 3, 2024
45e99cb
fix: Fix docusaurus-mdx-checker errors (#3217)
arturowczarek Nov 4, 2024
d1c97ac
Add EnvironmentVariablesRunFacet to core spec. (#3186)
JDarDagran Nov 4, 2024
56899b4
docs: Upgrade docusaurus to 3.6 (#3219)
arturowczarek Nov 5, 2024
cfe1499
Add assertions for format in test events. (#3221)
JDarDagran Nov 5, 2024
0ed2663
docs: fix outdated Spark-related docs (#3215)
mobuchowski Nov 5, 2024
22f8ecc
spark: Update Dataproc run facet to include runType property (#3167)
codelixir Nov 5, 2024
46db181
Move kinesis to separate module, move http transport to httpclient5 (…
mobuchowski Nov 5, 2024
bf4aaf0
update changelog for release 1.24.0 (#3224)
mobuchowski Nov 5, 2024
57021e5
Prepare for release 1.24.0
mobuchowski Nov 5, 2024
fcc4464
Prepare next development version 1.25.0-SNAPSHOT
mobuchowski Nov 5, 2024
ad963f9
add build context to flink integration test (#3225)
mobuchowski Nov 6, 2024
159e571
docs: Inject OpenLineage version in documentation (#3223)
arturowczarek Nov 6, 2024
8906ff3
Make `from_dict` backwards compatible in Python client. (#3226)
JDarDagran Nov 6, 2024
df11285
Prepare for release 1.24.1
mobuchowski Nov 6, 2024
6f1d77c
Prepare next development version 1.25.0-SNAPSHOT
mobuchowski Nov 6, 2024
130a5df
passthrough credentials to Flink build job (#3227)
mobuchowski Nov 6, 2024
e51beba
Prepare for release 1.24.2
mobuchowski Nov 6, 2024
49ec821
Prepare next development version 1.25.0-SNAPSHOT
mobuchowski Nov 6, 2024
549ed1d
Make edit link variable to version. (#3229)
JDarDagran Nov 8, 2024
0399867
[Integration][DBT] Fix compatibility with DBT v1.8 (#3228)
NJA010 Nov 8, 2024
41d155f
fix: Wrong Scala version in Databricks integration tests configuratio…
arturowczarek Nov 12, 2024
de844ef
Replace bitly links with permanent slack invite. (#3230)
merobi-hub Nov 12, 2024
195ecf4
Introduce InputStatisticsInputDatasetFacet (#3238)
pawel-big-lebowski Nov 12, 2024
ab3ad96
java client: remove TransportType enum (#3239)
mobuchowski Nov 12, 2024
4c1ed18
build(deps): bump the integration-spark group (#3233)
dependabot[bot] Nov 12, 2024
acbe3d1
fix configurable integration test (#3237)
pawel-big-lebowski Nov 12, 2024
1252356
Enable building input/output facets from the visitors (#3207)
pawel-big-lebowski Nov 12, 2024
2a383e1
fix: Clean project before running Databricks integration tests (#3241)
arturowczarek Nov 13, 2024
58fbd67
(proposal): New tags facet for arbitrary metadata (#3234)
leogodin217 Nov 13, 2024
61bea61
feature: Upload Databricks integration tests logs (#3242)
arturowczarek Nov 14, 2024
e1f41b2
fix: Databricks File System schema/namespace in Naming.md (#3243)
duzun Nov 14, 2024
f7942cf
Spark: Exclude META-INF/*TransportBuilder from Spark Extension Interf…
tnazarew Nov 15, 2024
fbc00b3
added proper databricks configuration to fix Databricks SparkUI issue…
algorithmy1 Nov 18, 2024
b9e6871
ci: bump xcode version (#3254)
mobuchowski Nov 18, 2024
b290136
Enable Delta 3+ version compatibility (#3253)
Jorricks Nov 19, 2024
b6862a5
Revert "ci: bump xcode version (#3254)" (#3257)
JDarDagran Nov 19, 2024
f78a961
build(deps): bump aiohttp from 3.10.2 to 3.10.11 in /dev (#3255)
dependabot[bot] Nov 19, 2024
34ad525
build(deps): bump cross-spawn from 7.0.3 to 7.0.6 in /website (#3258)
dependabot[bot] Nov 19, 2024
2130ab9
Update integration version support in README.md (#3262)
mobuchowski Nov 19, 2024
e98f55e
blog post about flink native integration (#3247)
pawel-big-lebowski Nov 20, 2024
376cfa8
collect input/output statistics from Spark jobs (#3240)
pawel-big-lebowski Nov 20, 2024
97e27bd
input/output statistics for iceberg (#3263)
pawel-big-lebowski Nov 20, 2024
61f0f06
build(deps): bump path-to-regexp from 1.8.0 to 3.3.0 in /website (#3259)
dependabot[bot] Nov 20, 2024
c452560
[Integration][DBT] Support for CLL in DBT integration (#3264)
mayurmadnani Nov 21, 2024
7e0417d
fix iceberg vendor build (#3269)
pawel-big-lebowski Nov 22, 2024
9cd5edb
update changelog for 1.25.0 release (#3273)
mobuchowski Nov 26, 2024
8e923c3
Prepare for release 1.25.0
mobuchowski Nov 26, 2024
c6e2a21
sync with upstream release tag 1.25.0
jrosend Nov 26, 2024
31 changes: 11 additions & 20 deletions .circleci/config.yml
@@ -6,6 +6,7 @@ setup: true
# the path of an updated fileset
orbs:
continuation: circleci/[email protected]
github-cli: circleci/[email protected]

# optional parameter when triggering to
# only run a particular type of integration
@@ -51,6 +52,7 @@ jobs:
- image: cimg/python:3.8
steps:
- checkout
- github-cli/setup
- run:
name: Install yq
command: |
@@ -150,26 +152,14 @@ jobs:
name: Remove approval steps if not pull from forks.
command: |
pip install pyyaml==6.0.1
python -c "import yaml
d = yaml.safe_load(open('complete_config.yml'))
for workflow_name, workflow_definition in d['workflows'].items():
jobs = workflow_definition.get('jobs') if isinstance(workflow_definition, dict) else None
if not jobs: continue

# find all approvals
approvals = list(filter(lambda x: isinstance(x, dict) and list(x.values())[0].get('type') == 'approval', jobs))
for approval in approvals:
approval_name = next(iter(approval))
approval_upstreams = approval[approval_name].get('requires')
approval_downstream = list(filter(lambda x: isinstance(x, dict) and approval_name in list(x.values())[0].get('requires', ''), jobs))
# replace approval with its upstream jobs
for job in approval_downstream:
requires = next(iter(job.values()))['requires']
requires.remove(approval_name)
requires.extend(approval_upstreams)
jobs.remove(approval)
with open('complete_config.yml', 'w') as f:
f.write(yaml.dump(d, sort_keys=False))"
python dev/filter_approvals.py
- run: |
export IS_FULL_TESTS=$(gh pr view --json labels | jq 'any(.labels[]; .name == "full-tests")')
echo $IS_FULL_TESTS
if [ -z "$IS_FULL_TESTS" ] || [ "$IS_FULL_TESTS" == "0" ]; then
pip install pyyaml==6.0.1
python dev/filter_matrix.py
fi
- when:
condition:
or:
@@ -194,6 +184,7 @@ workflows:
schedule_workflow:
jobs:
- determine_changed_modules:
context: pr
filters:
tags:
only: /^[0-9]+(\.[0-9]+){2}(-rc\.[0-9]+)?$/
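
Note on this file's changes: the inline Python that rewrote complete_config.yml to splice out approval steps has been moved into dev/filter_approvals.py, and a new step uses gh and jq to set IS_FULL_TESTS from the PR's "full-tests" label, running dev/filter_matrix.py when the label is absent. A sketch of what dev/filter_approvals.py plausibly contains, reconstructed from the inline script deleted above (the file itself is not shown in this diff and may differ):

    # Sketch of dev/filter_approvals.py, reconstructed from the removed inline script.
    # Splices "approval" gates out of a compiled CircleCI config so downstream jobs
    # depend directly on the approval's upstream jobs.
    import yaml

    with open("complete_config.yml") as f:
        d = yaml.safe_load(f)

    for workflow_name, workflow_definition in d["workflows"].items():
        jobs = workflow_definition.get("jobs") if isinstance(workflow_definition, dict) else None
        if not jobs:
            continue

        # Find all approval gates in this workflow.
        approvals = [
            j for j in jobs
            if isinstance(j, dict) and list(j.values())[0].get("type") == "approval"
        ]
        for approval in approvals:
            approval_name = next(iter(approval))
            approval_upstreams = approval[approval_name].get("requires")
            downstream = [
                j for j in jobs
                if isinstance(j, dict)
                and approval_name in list(j.values())[0].get("requires", "")
            ]
            # Replace the approval with its upstream jobs in every dependent job.
            for job in downstream:
                requires = next(iter(job.values()))["requires"]
                requires.remove(approval_name)
                requires.extend(approval_upstreams)
            jobs.remove(approval)

    with open("complete_config.yml", "w") as f:
        f.write(yaml.dump(d, sort_keys=False))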
119 changes: 59 additions & 60 deletions .circleci/continue_config.yml
@@ -75,7 +75,32 @@ commands:
echo "Setting default Java to ${JAVA_BIN}"
sudo update-alternatives --set java ${JAVA_BIN}
sudo update-alternatives --set javac ${JAVAC_BIN}

set_java_spark_scala_version:
parameters:
env-variant:
type: string
description: "Set Java, Spark and Scala versions"
steps:
- run: |
set -eux
JAVA=$(echo << parameters.env-variant >> | cut -d '-' -f 1 | cut -d ':' -f 2)
SPARK=$(echo << parameters.env-variant >> | cut -d '-' -f 2 | cut -d ':' -f 2)
SCALA=$(echo << parameters.env-variant >> | cut -d '-' -f 3 | cut -d ':' -f 2)
echo spark=$SPARK java=$JAVA scala=$SCALA
JAVA8_HOME='/usr/lib/jvm/java-8-openjdk-amd64'
JAVA17_HOME='/usr/lib/jvm/java-17-openjdk-amd64'
JAVA_BIN=$([ "$JAVA" = "17" ] && echo "$JAVA17_HOME/bin/java" || echo "$JAVA8_HOME/jre/bin/java")
JAVAC_BIN=$([ "$JAVA" = "17" ] && echo "$JAVA17_HOME/bin/javac" || echo "$JAVA8_HOME/bin/javac")

echo 'export JAVA17_HOME="/usr/lib/jvm/java-17-openjdk-amd64"' >> "$BASH_ENV"
echo "export SPARK=\"${SPARK}\"" >> "$BASH_ENV"
echo "export JAVA=\"${JAVA}\"" >> "$BASH_ENV"
echo "export JAVA_BIN=\"${JAVA_BIN}\"" >> "$BASH_ENV"
echo "export JAVAC_BIN=\"${JAVAC_BIN}\"" >> "$BASH_ENV"
echo "export SCALA=\"${SCALA}\"" >> "$BASH_ENV"
echo "Setting default Java to ${JAVA_BIN}"
sudo update-alternatives --set java ${JAVA_BIN}
sudo update-alternatives --set javac ${JAVAC_BIN}
store_submodule_tests:
parameters:
submodule:
@@ -462,7 +487,7 @@ jobs:
- store_test_results:
path: build/test-results/test
- store_test_results:
path: transports-dataplex/build/test-results/test
path: transports-gcplineage/build/test-results/test
- store_test_results:
path: transports-gcs/build/test-results/test
- store_test_results:
@@ -471,7 +496,7 @@
path: build/reports/tests/test
destination: test-report
- store_artifacts:
path: transports-dataplex/build/reports/tests/test
path: transports-gcplineage/build/reports/tests/test
destination: test-report
- store_artifacts:
path: transports-gcs/build/reports/tests/test
@@ -486,7 +511,7 @@
path: build/libs
destination: libs
- store_artifacts:
path: transports-dataplex/build/libs
path: transports-gcplineage/build/libs
destination: libs
- store_artifacts:
path: transports-gcs/build/libs
@@ -517,8 +542,8 @@ jobs:
path: ./build/libs
destination: java-client-artifacts
- store_artifacts:
path: ./transports-dataplex/build/libs
destination: transports-dataplex-artifacts
path: ./transports-gcplineage/build/libs
destination: transports-gcplineage-artifacts
- store_artifacts:
path: ./transports-gcs/build/libs
destination: transports-gcs-artifacts
@@ -532,8 +557,8 @@

release-integration-spark:
working_directory: ~/openlineage/integration/spark
docker:
- image: cimg/openjdk:17.0
machine:
image: ubuntu-2404:current
steps:
- *checkout_project_root
- run:
@@ -544,15 +569,16 @@
- v1-release-client-java-{{ checksum "/tmp/checksum.txt" }}
- attach_workspace:
at: ~/
- set_java_version
- run: |
# Get, then decode the GPG private key used to sign *.jar
export ORG_GRADLE_PROJECT_signingKey=$(echo $GPG_SIGNING_KEY | base64 -d)
export RELEASE_PASSWORD=$(echo $OSSRH_TOKEN_PASSWORD)
export RELEASE_USERNAME=$(echo $OSSRH_TOKEN_USERNAME)

# Publish *.jar
./gradlew --no-daemon --console=plain clean publishToSonatype closeAndReleaseSonatypeStagingRepository --info -Pscala.binary.version=2.12 -Pjava.compile.home=/usr/local/jdk-17.0.11
./gradlew --no-daemon --console=plain clean publishToSonatype closeAndReleaseSonatypeStagingRepository --info -Pscala.binary.version=2.13 -Pjava.compile.home=/usr/local/jdk-17.0.11
./gradlew --no-daemon --console=plain clean publishToSonatype closeAndReleaseSonatypeStagingRepository --info -Pscala.binary.version=2.12 -Pjava.compile.home=${JAVA17_HOME}
./gradlew --no-daemon --console=plain clean publishToSonatype closeAndReleaseSonatypeStagingRepository --info -Pscala.binary.version=2.13 -Pjava.compile.home=${JAVA17_HOME}
- store_artifacts:
path: ./build/libs
destination: spark-client-artifacts
@@ -620,24 +646,8 @@ jobs:
command: ./../../.circleci/checksum.sh /tmp/checksum.txt $CIRCLE_BRANCH
- attach_workspace:
at: ~/
- run:
name: Spark & Java version Variable
command: |
JAVA=$(echo << parameters.env-variant >> | cut -d '-' -f 1 | cut -d ':' -f 2)
SPARK=$(echo << parameters.env-variant >> | cut -d '-' -f 2 | cut -d ':' -f 2)
SCALA=$(echo << parameters.env-variant >> | cut -d '-' -f 3 | cut -d ':' -f 2)
echo spark=$SPARK java=$JAVA scala=$SCALA
JAVA8_HOME='/usr/lib/jvm/java-8-openjdk-amd64'
JAVA17_HOME='/usr/lib/jvm/java-17-openjdk-amd64'

echo 'export JAVA17_HOME=/usr/lib/jvm/java-17-openjdk-amd64' >> "$BASH_ENV"
echo 'export SPARK='${SPARK} >> "$BASH_ENV"
echo 'export JAVA_BIN='$([ "$JAVA" = "17" ] && echo "$JAVA17_HOME/bin/java" || echo "$JAVA8_HOME/jre/bin/java") >> "$BASH_ENV"
echo 'export JAVAC_BIN='$([ "$JAVA" = "17" ] && echo "$JAVA17_HOME/bin/javac" || echo "$JAVA8_HOME/bin/javac") >> "$BASH_ENV"
echo 'export SCALA='${SCALA} >> "$BASH_ENV"
echo "${JAVA}"
echo "${JAVA_BIN}"
echo "${JAVAC_BIN}"
- set_java_spark_scala_version:
env-variant: << parameters.env-variant >>
- restore_cache:
keys:
- v1-integration-spark-{{ checksum "/tmp/checksum.txt" }}
@@ -689,33 +699,15 @@ jobs:
- run:
name: Generate cache key
command: ./../../.circleci/checksum.sh /tmp/checksum.txt $CIRCLE_BRANCH
- run:
name: Spark & Java version Variable
command: |
JAVA=$(echo << parameters.env-variant >> | cut -d '-' -f 1 | cut -d ':' -f 2)
SPARK=$(echo << parameters.env-variant >> | cut -d '-' -f 2 | cut -d ':' -f 2)
SCALA=$(echo << parameters.env-variant >> | cut -d '-' -f 3 | cut -d ':' -f 2)
echo spark=$SPARK java=$JAVA scala=$SCALA
JAVA8_HOME='/usr/lib/jvm/java-8-openjdk-amd64'
JAVA17_HOME='/usr/lib/jvm/java-17-openjdk-amd64'

echo 'export JAVA17_HOME=/usr/lib/jvm/java-17-openjdk-amd64' >> "$BASH_ENV"
echo 'export SPARK_VERSION_VAR='${SPARK} >> "$BASH_ENV"
echo 'export SCALA='${SCALA} >> "$BASH_ENV"
echo 'export JAVA_BIN='$([ "$JAVA" = "17" ] && echo "$JAVA17_HOME/bin/java" || echo "$JAVA8_HOME/jre/bin/java") >> "$BASH_ENV"
echo 'export JAVAC_BIN='$([ "$JAVA" = "17" ] && echo "$JAVA17_HOME/bin/javac" || echo "$JAVA8_HOME/bin/javac") >> "$BASH_ENV"
echo $JAVA_BIN
- set_java_spark_scala_version:
env-variant: << parameters.env-variant >>
- run: mkdir -p app/build/gcloud && echo $GCLOUD_SERVICE_KEY > app/build/gcloud/gcloud-service-key.json && chmod 644 app/build/gcloud/gcloud-service-key.json
- restore_cache:
keys:
- v1-integration-spark-{{ checksum "/tmp/checksum.txt" }}
- attach_workspace:
at: ~/
- run: |
echo "Setting default Java to ${JAVA_BIN}"
sudo update-alternatives --set java ${JAVA_BIN}
sudo update-alternatives --set javac ${JAVAC_BIN}
- run: ./gradlew --no-daemon --console=plain integrationTest -x test -Pspark.version=${SPARK_VERSION_VAR} -Pscala.binary.version=${SCALA} -Pjava.compile.home=${JAVA17_HOME}
- run: ./gradlew --no-daemon --console=plain integrationTest -x test -Pspark.version=${SPARK} -Pscala.binary.version=${SCALA} -Pjava.compile.home=${JAVA17_HOME}
- run: ./gradlew --no-daemon --console=plain jacocoTestReport -Pscala.binary.version=${SCALA} -Pjava.compile.home=${JAVA17_HOME}
- store_test_results:
path: app/build/test-results/integrationTest
@@ -846,7 +838,7 @@ jobs:

integration-test-databricks-integration-spark:
parameters:
spark-version:
env-variant:
type: string
working_directory: ~/openlineage/integration/spark
machine:
@@ -871,20 +863,23 @@
- v1-integration-spark-{{ checksum "/tmp/checksum.txt" }}
- attach_workspace:
at: ~/
- set_java_version
- run: |
sudo update-alternatives --set java ${JAVA_BIN}
sudo update-alternatives --set javac ${JAVAC_BIN}
- run: ./gradlew --console=plain shadowJar -x test -Pjava.compile.home=${JAVA17_HOME}
- run: ./gradlew --no-daemon --console=plain databricksIntegrationTest -x test -Pspark.version=<< parameters.spark-version >> -PdatabricksHost=$DATABRICKS_HOST -PdatabricksToken=$DATABRICKS_TOKEN -Pjava.compile.home=${JAVA17_HOME}
- set_java_spark_scala_version:
env-variant: << parameters.env-variant >>
- run: mkdir -p app/build/logs
- run: mkdir -p app/build/events
- run: ./gradlew --console=plain clean shadowJar -x test -Pjava.compile.home=${JAVA17_HOME}
- run: ./gradlew --no-daemon --console=plain databricksIntegrationTest -x test -Pspark.version=${SPARK} -Pscala.binary.version=${SCALA} -Pjava.compile.home=${JAVA17_HOME} -Dopenlineage.tests.databricks.workspace.host=$DATABRICKS_HOST -Dopenlineage.tests.databricks.workspace.token=$DATABRICKS_TOKEN
- store_test_results:
path: app/build/test-results/databricksIntegrationTest
- store_artifacts:
path: app/build/reports/tests/databricksIntegrationTest
destination: test-report
- store_artifacts:
path: app/build/cluster-log4j.log
destination: cluster-log4j.log
path: app/build/logs
destination: cluster-logs
- store_artifacts:
path: app/build/events
destination: events
- save_cache:
key: v1-databricks-integration-spark-{{ checksum "/tmp/checksum.txt" }}
paths:
@@ -992,8 +987,12 @@ jobs:
at: ~/
- set_java_version
- run: chmod -R 777 data/iceberg/db
- run: ./gradlew --console=plain examples:stateful:build -Pflink.version=<< parameters.flink-version >>
- run: ./gradlew --no-daemon --console=plain integrationTest --i -Pflink.version=<< parameters.flink-version >>
- run: |
# Get, then decode the GPG private key used to sign *.jar
export ORG_GRADLE_PROJECT_signingKey=$(echo $GPG_SIGNING_KEY | base64 -d)
export RELEASE_PASSWORD=$(echo $OSSRH_TOKEN_PASSWORD)
export RELEASE_USERNAME=$(echo $OSSRH_TOKEN_USERNAME)
./gradlew --no-daemon --console=plain integrationTest --i -Pflink.version=<< parameters.flink-version >>
- run:
when: on_fail
command: cat app/build/test-results/integrationTest/TEST-*.xml
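
Note on this file's changes: the repeated "Spark & Java version Variable" run steps are consolidated into the new set_java_spark_scala_version command defined near the top, which parses an env-variant string such as java:17-spark:3.5.2-scala:2.13 with cut. A minimal Python equivalent of that parsing, for illustration only (the CI itself uses the shell pipeline shown above):

    def parse_env_variant(variant: str) -> dict:
        """Mirror the cut pipeline: split on '-', then take the value after ':'."""
        parts = {}
        for field in variant.split("-"):
            key, sep, value = field.partition(":")
            if sep:  # skips bare suffix tokens like "full"/"tests" in "-full-tests"
                parts[key] = value
        return parts

    assert parse_env_variant("java:8-spark:2.4.8-scala:2.12") == {
        "java": "8", "spark": "2.4.8", "scala": "2.12"
    }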
1 change: 1 addition & 0 deletions .circleci/workflows/openlineage-flink.yml
@@ -23,6 +23,7 @@ workflows:
filters:
tags:
only: /^[0-9]+(\.[0-9]+){2}(-rc\.[0-9]+)?$/
context: << pipeline.parameters.build-context >>
matrix:
parameters:
flink-version: [ '1.15.4', '1.16.2', '1.17.1', '1.18.1', '1.19.0' ]
49 changes: 26 additions & 23 deletions .circleci/workflows/openlineage-spark.yml
@@ -23,17 +23,17 @@ workflows:
parameters:
env-variant: [
'java:8-spark:2.4.8-scala:2.12',
'java:8-spark:3.2.4-scala:2.12',
'java:8-spark:3.2.4-scala:2.13',
'java:8-spark:3.3.4-scala:2.12',
'java:8-spark:3.3.4-scala:2.13',
'java:17-spark:3.3.4-scala:2.12',
'java:17-spark:3.3.4-scala:2.13',
'java:8-spark:3.4.3-scala:2.12',
'java:8-spark:3.4.3-scala:2.13',
'java:8-spark:3.5.2-scala:2.12',
'java:8-spark:3.5.2-scala:2.13',
'java:17-spark:3.5.2-scala:2.12',
'java:8-spark:3.2.4-scala:2.12-full-tests',
'java:8-spark:3.2.4-scala:2.13-full-tests',
'java:8-spark:3.3.4-scala:2.12-full-tests',
'java:8-spark:3.3.4-scala:2.13-full-tests',
'java:17-spark:3.3.4-scala:2.12-full-tests',
'java:17-spark:3.3.4-scala:2.13-full-tests',
'java:8-spark:3.4.3-scala:2.12-full-tests',
'java:8-spark:3.4.3-scala:2.13-full-tests',
'java:8-spark:3.5.2-scala:2.12-full-tests',
'java:8-spark:3.5.2-scala:2.13-full-tests',
'java:17-spark:3.5.2-scala:2.12-full-tests',
'java:17-spark:3.5.2-scala:2.13',
'java:17-spark:4.0.0-scala:2.13'
]
@@ -92,7 +92,10 @@ workflows:
context: integration-tests
matrix:
parameters:
spark-version: [ '3.4.2', '3.5.2' ]
env-variant: [
'java:8-spark:3.4.1-scala:2.12-full-tests',
'java:17-spark:3.5.0-scala:2.12-full-tests'
]
requires:
- approval-integration-spark
post-steps:
@@ -112,17 +115,17 @@
parameters:
env-variant: [
'java:8-spark:2.4.8-scala:2.12',
'java:8-spark:3.2.4-scala:2.12',
'java:8-spark:3.2.4-scala:2.13',
'java:8-spark:3.3.4-scala:2.12',
'java:8-spark:3.3.4-scala:2.13',
'java:17-spark:3.3.4-scala:2.12',
'java:17-spark:3.3.4-scala:2.13',
'java:8-spark:3.4.3-scala:2.12',
'java:8-spark:3.4.3-scala:2.13',
'java:8-spark:3.5.2-scala:2.12',
'java:8-spark:3.5.2-scala:2.13',
'java:17-spark:3.5.2-scala:2.12',
'java:8-spark:3.2.4-scala:2.12-full-tests',
'java:8-spark:3.2.4-scala:2.13-full-tests',
'java:8-spark:3.3.4-scala:2.12-full-tests',
'java:8-spark:3.3.4-scala:2.13-full-tests',
'java:17-spark:3.3.4-scala:2.12-full-tests',
'java:17-spark:3.3.4-scala:2.13-full-tests',
'java:8-spark:3.4.3-scala:2.12-full-tests',
'java:8-spark:3.4.3-scala:2.13-full-tests',
'java:8-spark:3.5.2-scala:2.12-full-tests',
'java:8-spark:3.5.2-scala:2.13-full-tests',
'java:17-spark:3.5.2-scala:2.12-full-tests',
'java:17-spark:3.5.2-scala:2.13',
'java:17-spark:4.0.0-scala:2.13'
]
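
The matrix values here gain a -full-tests suffix, pairing with the full-tests label check added to .circleci/config.yml: when a PR lacks that label, dev/filter_matrix.py presumably prunes the suffixed variants so only the trimmed matrix runs. That script's contents are not part of this diff; a hypothetical sketch of such pruning, assuming it drops env-variant values ending in -full-tests:

    # Hypothetical sketch of dev/filter_matrix.py; the real script is not shown
    # in this diff and may work differently.
    import yaml

    with open("complete_config.yml") as f:
        config = yaml.safe_load(f)

    for workflow in config.get("workflows", {}).values():
        if not isinstance(workflow, dict):
            continue
        for job in workflow.get("jobs", []):
            if not isinstance(job, dict):
                continue
            cfg = list(job.values())[0]
            if not isinstance(cfg, dict):
                continue
            params = cfg.get("matrix", {}).get("parameters", {})
            if "env-variant" in params:
                # Keep only the fast variants when the "full-tests" label is absent.
                params["env-variant"] = [
                    v for v in params["env-variant"] if not v.endswith("-full-tests")
                ]

    with open("complete_config.yml", "w") as f:
        f.write(yaml.dump(config, sort_keys=False))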