Implement PscSource, PscSink, and Table APIs to close the gap on Flink 1.15.1 APIs (#45)

- WIP full 1.15 upgrade
- WIP sink APIs done
- WIP source APIs
- Abstracting out TransactionManager reflections and direct field access logic
- WIP refactored FlinkPscInternalProducer (sink) to use abstracted logic
- Revert irrelevant changes
- Add javadocs
- Improve javadocs
- Small spacing update
- Quick javadoc update
- Add graceful handling of null transactionManager
- Update package name
- Bump version to 3.2.1-SNAPSHOT
- Make Flink APIs use TransactionManagerUtils
- Disallow >1 backend producer upon send() when PscProducer is transactional
- Implement creation of backend producer immediately upon init of FlinkPscInternalProducer
- WIP source changes
- WIP Flink metrics
- WIP PscTopicUriPartitionSplitReader
- Finish code changes to source and sink APIs
- WIP finished sink test refactoring
- WIP finished table API source code refactor
- WIP source test changes
- WIP source/sink APIs mainly done, first-round refactor
- Finish table test imports
- WIP compiles
- WIP FlinkPscInternalProducerITCase
- WIP PscCommitterTest
- WIP finished PscCommitterTest
- WIP fixed test sink with lower parallelism in PscSinkITCase integration tests
- WIP finished PscSinkITCase integration tests
- WIP finish PscSinkITCase
- WIP finish PscTransactionLogITCase
- WIP finish PscWriterITCase
- WIP all sink tests pass
- WIP finished OffsetInitializerTest
- WIP finished PscSubscriberTest
- WIP finish PscEnumeratorTest
- WIP finish PscSourceReaderMetricsTest
- WIP finish PscRecordDeserializationSchemaTest
- WIP in the middle of fixing PscSourceReaderTest
- WIP finish PscSourceReaderTest
- WIP finished PscTopicUriPartitionSplitReaderTest
- WIP finish PscSourceBuilderTest
- WIP fixing PscSourceITCase integration tests
- WIP finish PscSourceITCase; need to look into BaseTopicUri.equals() and whether we can introduce logic in validate() to build the correct subclass of TopicUri
- WIP finish Source and Sink tests
- Fix most producer tests in streaming.connectors.psc by calling super.snapshotState() instead of supersSnapshotState(); table API and some checkpoint migration tests remaining
- WIP fixing FlinkPscProducerMigrationOperatorTest
- Revert "WIP fixing FlinkPscProducerMigrationOperatorTest" (reverts commit 88f402f)
- WIP finish PscChangelogTableITCase; FlinkPscProducerMigrationOperatorTest still flaky, but it is 011 so ignoring for now
- WIP finish PscDynamicTableFactoryTest
- WIP finish PscTableITCase
- Finish table tests
- Minor fixes
- Catch IOException
- Skip config logging in tests
- Exclude kafka-schema-registry-client from flink-avro-confluent-registry in psc-flink OSS
- Revert "Exclude kafka-schema-registry-client from flink-avro-confluent-registry in psc-flink oss" (reverts commit 548b3f6)
- Include ITCase tests in mvn surefire plugin
- Convert HashMap to ConcurrentHashMap in TransactionManagerUtils
- Remove ITCase from surefire
- Disable Kafka log segment rotation for Flink tests
- Add retention.ms=Long.MAX_VALUE to prevent topic cleanup during test
- Refactor to remove Kafka references
- Add step to run ITCase tests in build
- Fix mvn clean test in yaml
- Fix build yaml
- Set default psc.config.logging.enabled=false for OSS
- Revert "Set default psc.config.logging.enabled=false for OSS" (reverts commit b4c11af)
- Add default psc.conf in Flink test resources
- Make PscMetadataClient convert BaseTopicUri to backend-specific topicUri
- Make metadataClient always convert to plaintext protocol
- Add logs to debug
- Revert "Make metadataClient always convert to plaintext protocol" (reverts commit 46054b5)
- Make metadata client preserve protocol in describeTopicUris()
- Add null check to prevent initialization NPE for byteOutMetric in PscWriter
- Add NPE check for updateNumBytesInCounter
- Surround NPE in PscSourceReaderMetrics with try/catch
- Add logs to debug producer thread leak
- Add more debug logs to see why producerPool doesn't get added
- Add even more logs to debug producer leak
- Add even more logs to debug producer leak
- Revert "Add even more logs to debug producerleak" (reverts commit 235c608)
- Revert "Add even more logs to debug producerLeak" (reverts commit 5ab0954)
- Revert "Add more debug logs to see why producerPool doesn't get added" (reverts commit a44f0b6)
- Revert "Add logs to debug producer thread leak" (reverts commit c406077)
- Make metrics() return ConcurrentHashMap in PscConsumer
- Surround NPE in PscWriter registerMetricSync with try/catch
- Remove commented pom content
- Address comments
- Update javadocs for PscSink and PscSource to highlight differences compared to FlinkPscProducer and FlinkPscConsumer
- Add integration test for batch committed + javadocs
- Fix javadoc
Showing 189 changed files with 24,117 additions and 4,913 deletions.
```diff
@@ -22,3 +22,6 @@ buildNumber.properties
+
+# vscode
+.vscode
+
+#test files
+psc-flink/orgapacheflinkutilNetUtils*
```
psc-flink/src/main/java/com/pinterest/flink/connector/psc/MetricUtil.java (78 additions, 0 deletions)
```java
/*
 * Licensed to the Apache Software Foundation (ASF) under one or more
 * contributor license agreements. See the NOTICE file distributed with
 * this work for additional information regarding copyright ownership.
 * The ASF licenses this file to You under the Apache License, Version 2.0
 * (the "License"); you may not use this file except in compliance with
 * the License. You may obtain a copy of the License at
 *
 *    http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.pinterest.flink.connector.psc;

import com.pinterest.psc.metrics.Metric;
import com.pinterest.psc.metrics.MetricName;
import org.apache.flink.annotation.Internal;
import org.apache.flink.metrics.Counter;

import java.util.Map;
import java.util.function.Predicate;

/** Collection of methods to interact with PSC's client metric system. */
@Internal
public class MetricUtil {

    /**
     * Tries to find the PSC {@link Metric} in the provided metrics.
     *
     * @return {@link Metric} which exposes continuous updates
     * @throws IllegalStateException if the metric is not part of the provided metrics
     */
    public static Metric getPscMetric(
            Map<MetricName, ? extends Metric> metrics, String metricGroup, String metricName) {
        return getPscMetric(
                metrics,
                e ->
                        e.getKey().group().equals(metricGroup)
                                && e.getKey().name().equals(metricName));
    }

    /**
     * Tries to find the PSC {@link Metric} in the provided metrics matching a given filter.
     *
     * @return {@link Metric} which exposes continuous updates
     * @throws IllegalStateException if no metric matches the given filter
     */
    public static Metric getPscMetric(
            Map<MetricName, ? extends Metric> metrics,
            Predicate<Map.Entry<MetricName, ? extends Metric>> filter) {
        return metrics.entrySet().stream()
                .filter(filter)
                .map(Map.Entry::getValue)
                .findFirst()
                .orElseThrow(
                        () ->
                                new IllegalStateException(
                                        "Cannot find PSC metric matching current filter."));
    }

    /**
     * Ensures that the counter has the same value as the given PSC metric.
     *
     * <p>Do not use this method for every record because {@link Metric#metricValue()} is an
     * expensive operation.
     *
     * @param from PSC's {@link Metric} to query
     * @param to {@link Counter} to write the value to
     */
    public static void sync(Metric from, Counter to) {
        to.inc(((Number) from.metricValue()).longValue() - to.getCount());
    }
}
```
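To illustrate how this utility is meant to be used, here is a self-contained sketch of the same two patterns: a fail-fast metric lookup over a metric map, and the delta-based `sync` that keeps a counter converged on the metric's value without double counting. The PSC and Flink types (`Metric`, `MetricName`, `Counter`) are stubbed with minimal stand-ins here, and the class and metric names are illustrative, not taken from the repository.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Predicate;

// Minimal stand-ins for PSC's Metric/MetricName and Flink's Counter, stubbed
// so the lookup and delta-sync pattern is runnable on its own.
public class MetricUtilSketch {

    interface Metric { Object metricValue(); }

    record MetricName(String group, String name) {}

    static class Counter {
        private long count;
        void inc(long n) { count += n; }
        long getCount() { return count; }
    }

    // Same shape as MetricUtil.getPscMetric: scan the metric map and fail
    // loudly if nothing matches, instead of returning null.
    static Metric getMetric(Map<MetricName, Metric> metrics,
                            Predicate<Map.Entry<MetricName, Metric>> filter) {
        return metrics.entrySet().stream()
                .filter(filter)
                .map(Map.Entry::getValue)
                .findFirst()
                .orElseThrow(() -> new IllegalStateException(
                        "Cannot find metric matching current filter."));
    }

    // Delta-sync: add only the difference between the source metric and the
    // counter, so repeated calls converge without double counting.
    static void sync(Metric from, Counter to) {
        to.inc(((Number) from.metricValue()).longValue() - to.getCount());
    }

    public static void main(String[] args) {
        Map<MetricName, Metric> metrics = new HashMap<>();
        // Hypothetical metric name, only for this sketch.
        metrics.put(new MetricName("producer-metrics", "outgoing-byte-total"),
                () -> 1024L);

        Metric byteTotal = getMetric(metrics,
                e -> e.getKey().group().equals("producer-metrics")
                        && e.getKey().name().equals("outgoing-byte-total"));

        Counter counter = new Counter();
        sync(byteTotal, counter); // counter becomes 1024
        sync(byteTotal, counter); // delta is 0, so the value stays 1024
        System.out.println(counter.getCount());
    }
}
```

The delta form matters because Flink's `Counter` only exposes `inc`, not `set`: adding `metricValue() - getCount()` is the idempotent way to mirror an absolute metric onto a monotonically updated counter.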