
Commit

jc-1663 update connectors prereq (#602)
* jc-1663 update connectors prereq

* jc-1663 edits for peer review

* jc-1663 more edits for peer review

* jc-1663 again more edits for peer review

* jc-1663 update attributes and other fixes

* jc-1663 fixing use of attributes

* jc-1663 fixing extra spaces
MelissaFlinn authored Nov 11, 2022
1 parent a1121be commit d604112
Showing 26 changed files with 148 additions and 37 deletions.
4 changes: 4 additions & 0 deletions docs/_artifacts/document-attributes.adoc
@@ -15,6 +15,10 @@
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
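The attribute definitions added in this hunk act as simple text macros in AsciiDoc: a `{name}` reference anywhere in the body expands to the attribute's value at render time, which is why the comment says "to avoid typos". A minimal sketch of the substitution (the surrounding sentence is illustrative, not quoted from a specific file):

```asciidoc
//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

// An attribute reference in body text ...
If you're using a trial cluster in your own {openshift-dedicated} environment, select the namespace card.
// ... renders as:
// "If you're using a trial cluster in your own OpenShift Dedicated environment, select the namespace card."
```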
4 changes: 4 additions & 0 deletions docs/api-designer/getting-started-api-designer/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
86 changes: 50 additions & 36 deletions docs/connectors/getting-started-connectors/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
@@ -109,52 +113,61 @@ A *sink* connector allows you to send data from {product-kafka} to an external s
====
endif::[]

ifndef::qs[]
== Overview

{product-long-kafka} is a cloud service that simplifies the process of running Apache Kafka. Apache Kafka is an open-source, distributed, publish-subscribe messaging system for creating fault-tolerant, real-time data feeds.
[id="proc-verifying-prerequisites-for-connectors_{context}"]
== Verifying the prerequisites for using {product-long-connectors}

You can use {product-long-connectors} to configure communication between {product-kafka} instances and external services and applications. {product-long-connectors} allow you to configure how data moves from one endpoint to another without writing code.
[role="_abstract"]

The following diagram illustrates how data flows from a data source through a data source connector to a Kafka topic. And how data flows from a Kafka topic to a data sink through a data sink connector.
Before you use {product-connectors}, you must complete the following prerequisites:

[.screencapture]
.{product-long-connectors} data flow
image::connectors-diagram.png[Illustration of data flow from data source through Kafka to data sink]
* Determine which {openshift} environment to use for deploying your {product-connectors} instances.

endif::[]
* Configure {product-long-kafka} for use with {product-connectors}.

[id="proc-configuring-kafka-for-connectors_{context}"]
== Verifying that you have the prerequisites for using {product-long-connectors}
*Determining which {openshift} environment to use for deploying your {connectors} instances*

[role="_abstract"]
ifdef::qs[]
Before you can use {product-connectors}, you must complete the link:https://console.redhat.com/application-services/learning-resources?quickstart=getting-started[Getting started with {product-long-kafka}] quick start to set up the following components:
For Service Preview, you have two choices:

* *The hosted evaluation environment*

** The {connectors} instances are hosted on a multitenant {openshift-dedicated} cluster that is owned by Red Hat.
** You can create four {connectors} instances at a time.
** The evaluation environment applies 48-hour expiration windows, as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/8190dc9e-249c-4207-bd69-096e5dd5bc64[Red Hat {openshift} {connectors} Service Preview evaluation guidelines^].

* *Your own trial environment*

** You have access to your own {openshift-dedicated} trial environment.
** You can create an unlimited number of {connectors} instances.
** Your {openshift-dedicated} trial cluster expires after 60 days.
** A cluster administrator must install the {product-connectors} add-on as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01[Adding the Red Hat {openshift} {connectors} add-on to your {openshift-dedicated} trial cluster^].

*Configuring {product-long-kafka} for use with {product-connectors}*

* A *Kafka instance* that you can use for {product-connectors}.
* A *Kafka topic* to store messages sent by data sources and make the messages available to data sinks.
* A *service account* that allows you to connect and authenticate your {connectors} instances with your Kafka instance.
* *Access rules* for the service account that defines how your {connectors} instances can access and use the topics in your Kafka instance.
endif::[]
ifndef::qs[]
Before you can use {product-connectors}, you must complete the steps in _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_ to set up the following components:
Complete the steps in _{base-url}{getting-started-url-kafka}[Getting started with {product-long-kafka}^]_ to set up the following components:
endif::[]

ifdef::qs[]
Complete the steps in the link:https://console.redhat.com/application-services/learning-resources?quickstart=getting-started[Getting started with {product-long-kafka}] quick start to set up the following components:
endif::[]

* A *Kafka instance* that you can use for {product-connectors}.
* A *Kafka topic* to store messages sent by data sources and make the messages available to data sinks.
* A *service account* that allows you to connect and authenticate your {connectors} instances with your Kafka instance.
* *Access rules* for the service account that defines how your {connectors} instances can access and use the topics in your Kafka instance.
endif::[]
* *Access rules* for the service account that define how your {connectors} instances can access and use the topics in your Kafka instance.

ifdef::qs[]
.Procedure
Make sure that you have set up the prerequisite components.

.Verification
* Is the Kafka instance listed in the Kafka instances table and is it in the *Ready* state?
* Did you verify that your service account was successfully created in the *Service Accounts* page?
* Is the Kafka instance listed in the Kafka instances table and is the Kafka instance in the *Ready* state?
* Is your service account created in the *Service Accounts* page?
* Did you save your service account credentials to a secure location?
* Are the permissions for your service account listed in the *Access* page of the Kafka instance?
* Is the Kafka topic that you created for {product-connectors} listed in the topics table of the Kafka instance?
* Is the Kafka topic that you created for {connectors} listed in the topics table of the Kafka instance?
* If you plan to use a 60-day {openshift-dedicated} trial cluster to deploy your {product-connectors} instances, has a cluster administrator added the {product-connectors} add-on to your trial cluster?

endif::[]

@@ -165,6 +178,7 @@ ifndef::qs[]
* Verify that you saved your service account credentials to a secure location.
* Verify that the permissions for your service account are listed in the *Access* page of the Kafka instance.
* Verify that the Kafka topic that you created for {product-connectors} is listed in the Kafka instance's topics table.
* If you plan to use a 60-day {openshift-dedicated} trial cluster to deploy your {product-connectors} instances, verify that a cluster administrator added the {product-connectors} add-on to your trial cluster.

endif::[]

@@ -187,7 +201,7 @@ ifndef::qs[]
endif::[]

.Procedure
. In the {product-long-connectors} web console, select *Connectors* and then click *Create {connectors} instance*.
. In the {product-long-connectors} web console, select *{connectors}* and then click *Create {connectors} instance*.
. Select the connector that you want to use for connecting to a data source.
+
You can browse through the catalog of available connectors. You can also search for a particular connector by name, and filter for sink or source connectors.
@@ -198,11 +212,11 @@ Click the card to select the connector, and then click *Next*.

. For *Kafka instance*, click the card for the {product-kafka} instance that you configured for {connectors}, and then click *Next*.

. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment. The namespace is the deployment space that hosts your {connectors} instances.
. On the *Namespace* page, the namespace that you select depends on your {openshift-dedicated} environment. The namespace is the deployment space that hosts your {connectors} instances.
+
If you're using a trial cluster in your own OpenShift Dedicated environment, select the card for the namespace that was created when a system administrator added the {connectors} service to your trial cluster, as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01[Adding the Red Hat OpenShift Connectors add-on to your OpenShift Dedicated trial cluster^].
If you're using a trial cluster in your own {openshift-dedicated} environment, select the card for the namespace that was created when a system administrator added the {connectors} service to your trial cluster, as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01[Adding the Red Hat {openshift} {connectors} add-on to your {openshift-dedicated} trial cluster^].
+
If you're using the evaluation OpenShift Dedicated environment, click *Register eval namespace* to provision a namespace for hosting the {connectors} instances that you create.
If you're using the evaluation {openshift-dedicated} environment, click *Register eval namespace* to provision a namespace for hosting the {connectors} instances that you create.

. Click *Next*.

@@ -239,10 +253,10 @@ ifdef::qs[]
* Does your source {connectors} instance generate messages?
endif::[]
ifndef::qs[]
* Verify that your source {connectors} instance generate messages.
* Verify that your source {connectors} instance generates messages.
endif::[]

.. In the OpenShift Application Services web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. In the {product-long-rhoas} web console, select *Streams for Apache Kafka* > *Kafka Instances*.
.. Click the Kafka instance that you created for connectors.
.. Click the *Topics* tab and then click the topic that you specified for your source {connectors} instance.
.. Click the *Messages* tab to see a list of `Hello World!` messages.
@@ -275,11 +289,11 @@ endif::[]
+
For example, select *test* and then click *Next*.

. On the *Namespace* page, the namespace that you select depends on your OpenShift Dedicated environment. The namespace is the deployment space that hosts your {connectors} instances.
. On the *Namespace* page, the namespace that you select depends on your {openshift-dedicated} environment. The namespace is the deployment space that hosts your {connectors} instances.
+
If you're using a trial cluster on your own OpenShift Dedicated environment, select the card for the namespace that was created when you added the {connectors} service to your trial cluster.
If you're using a trial cluster on your own {openshift-dedicated} environment, select the card for the namespace that was created when you added the {connectors} service to your trial cluster.
+
If you're using the evaluation OpenShift Dedicated environment, click the *eval namespace* that you created when you created the source connector.
If you're using the evaluation {openshift-dedicated} environment, click the *eval namespace* that you created when you created the source connector.

. Click *Next*.

@@ -298,7 +312,7 @@ If you're using the evaluation OpenShift Dedicated environment, click the *eval

. Review the summary of the configuration properties and then click *Create {connectors} instance*.
+
Your {connectors} instance is listed in the table of Connectors.
Your {connectors} instance is listed in the table of {connectors}.
+
After a couple of seconds, the status of your {connectors} instance changes to the *Ready* state. It consumes messages from the associated Kafka topic and sends them to the data sink (for this example, the data sink is the HTTP URL that you provided).

Expand All @@ -309,7 +323,7 @@ ifdef::qs[]
endif::[]

ifndef::qs[]
* Verify that you see HTTP POST calls with `"Hello World!!"` messages by opening a web browser tab to your custom URL for the link:https://webhook.site[webhook.site^].
* Verify that you see HTTP POST calls with `"Hello World!!"` messages. Open a web browser tab to your custom URL for the link:https://webhook.site[webhook.site^].
endif::[]


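Much of the README changed in this file is wrapped in AsciiDoc preprocessor conditionals: the `qs` attribute is set when the content is rendered as an interactive quick start and unset for the regular docs build, so each verification item appears in two variants (question form vs. imperative form). A minimal sketch of the pattern, using lines from the diff above:

```asciidoc
// Rendered only in the quick start (qs attribute set):
ifdef::qs[]
* Is the Kafka instance listed in the Kafka instances table and is the Kafka instance in the *Ready* state?
endif::[]

// Rendered only in the regular docs build (qs attribute unset):
ifndef::qs[]
* Verify that you saved your service account credentials to a secure location.
endif::[]
```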
3 changes: 2 additions & 1 deletion docs/connectors/getting-started-connectors/quickstart.yml
@@ -16,9 +16,10 @@ spec:
description: !snippet README.adoc#description
prerequisites:
- Complete the <a href="https://console.redhat.com/application-services/learning-resources?quickstart=getting-started">Getting started with OpenShift Streams for Apache Kafka</a> quick start.
- If you plan to use a 60-day OpenShift Dedicated trial cluster to deploy your Connectors instances, a cluster administrator must install the Connectors add-on as described in <a href="https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01">Adding the Red Hat OpenShift Connectors add-on to your OpenShift Dedicated trial cluster</a>.
introduction: !snippet README.adoc#introduction
tasks:
- !snippet/proc README.adoc#proc-configuring-kafka-for-connectors
- !snippet/proc README.adoc#proc-verifying-prerequisites-for-connectors
- !snippet/proc README.adoc#proc-creating-source-connector
- !snippet/proc README.adoc#proc-creating-sink-connector
conclusion: !snippet README.adoc#conclusion
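The `!snippet/proc` entries in quickstart.yml appear to be custom YAML tags resolved by this repo's docs build (an assumption suggested by the generated-content warning in the README headers): each entry pulls in the module from README.adoc whose `[id=...]` anchor matches the fragment after `#`. The renamed task therefore has to line up with the new module ID introduced earlier in this commit:

```asciidoc
// In README.adoc, the module that the quickstart.yml entry
// "!snippet/proc README.adoc#proc-verifying-prerequisites-for-connectors"
// resolves to ({context} is appended by the build):
[id="proc-verifying-prerequisites-for-connectors_{context}"]
== Verifying the prerequisites for using {product-long-connectors}
```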
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/access-mgmt-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/consumer-configuration-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/getting-started-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/kafka-bin-scripts-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/kafka-instance-settings/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/kcat-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/message-browsing-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/metrics-monitoring-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/nodejs-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/quarkus-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/rhoas-cli-getting-started-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands
4 changes: 4 additions & 0 deletions docs/kafka/service-binding-kafka/README.adoc
@@ -15,6 +15,10 @@ WARNING: This content is generated by running npm --prefix .build run generate:a
:cloud-console-url: https://console.redhat.com/
:service-accounts-url: https://console.redhat.com/application-services/service-accounts

//to avoid typos
:openshift: OpenShift
:openshift-dedicated: OpenShift Dedicated

//OpenShift Application Services CLI
:base-url-cli: https://github.com/redhat-developer/app-services-cli/tree/main/docs/
:command-ref-url-cli: commands

