How to deploy and execute processes
Sri Harsha Boda edited this page Sep 15, 2017
After making the metadata entries for your process, you have to deploy the process in your Hadoop environment.
- Click the Deploy button on the process page for the process you want to deploy. (When you hover over the button, it shows the deployment status of the process.)
- Copy the process-deploy-env.properties and process-deploy.sh files from the application repositories, and put the properties file in the same directory from which you run the .sh file. Give executable permissions to every shell script.
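The copy and permission steps above can be sketched as follows; `DEPLOY_DIR` and the assumption that the two files sit in the current directory are illustrative, not project conventions:

```shell
# Minimal sketch: place the properties file next to the deploy script and
# make the scripts executable. DEPLOY_DIR is illustrative; adjust as needed.
DEPLOY_DIR="${DEPLOY_DIR:-$HOME/process-deploy}"
mkdir -p "$DEPLOY_DIR"
# Assumes the two files were downloaded into the current directory.
for f in process-deploy-env.properties process-deploy.sh; do
  if [ -f "$f" ]; then
    cp "$f" "$DEPLOY_DIR"/
  fi
done
chmod +x "$DEPLOY_DIR"/*.sh 2>/dev/null || true
```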
- Edit process-deploy-env.properties and set all properties according to your environment.
- In the jack repository, get deploy-env.properties and update it according to your environment.
- The Process Deploy main class calls a deploy script based on the container process type. For example, for process type 1 it calls the process-type-1.sh script.
- Specify the path of your process-type-n.sh scripts in md-config.xml, under your environment, within the deploy and script-path tags, up to the file-name prefix "process-type-". (If no environment matching your local system exists, create one with the same name as in mybatis-config.xml, copy the entries from any other environment, and edit them for your local setup.) process-deploy-env.properties should be in the same directory from which you run these scripts, and every shell script needs executable permissions.
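A sketch of what the relevant md-config.xml fragment might look like; the surrounding environment structure is shown schematically and the path is a placeholder:

```xml
<!-- Inside your environment's section of md-config.xml (schematic only);
     note the path stops at the "process-type-" prefix. -->
<deploy>
  <script-path>/home/cloudera/scripts/process-type-</script-path>
</deploy>
```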
- Set up a crontab entry in your environment to run the process-deploy.sh script, for example: `*/5 * * * * /home/cloudera/process-deploy.sh`
- After the deployment is complete and the UI shows the process status as deployed, you can execute it.
- Download Workflow.py and flume.sh, and update both according to your environment.
- Update the hostname variable in Workflow.py.
- Update the pathForFlumeng, pathForFlumeconf, and pathForFlumeconfFile variables in flume.sh.
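For example, the flume.sh variables might be set like this; the paths are illustrative, assuming a default flume-ng layout, and must be pointed at your own installation:

```shell
# Illustrative values only; point these at your own Flume installation.
pathForFlumeng=/usr/lib/flume-ng/bin/flume-ng
pathForFlumeconf=/usr/lib/flume-ng/conf
pathForFlumeconfFile=/usr/lib/flume-ng/conf/flume.conf
```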
- To store execution logs, create the log directory mentioned in your md-config.xml, and give every user permission to write to it.
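Creating the log directory can look like this; `LOG_DIR` and the /tmp path are stand-ins, and the real path must match your md-config.xml:

```shell
# LOG_DIR must match the log directory configured in md-config.xml;
# /tmp/process-logs is only a stand-in for this sketch.
LOG_DIR="${LOG_DIR:-/tmp/process-logs}"
mkdir -p "$LOG_DIR"
chmod 777 "$LOG_DIR"   # let every user write execution logs here
```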
- Also update the path to Workflow.py in md-config.xml under the execute and then oozie-script-path tags, and the path to flume.sh under execute and then standalone-script-path.
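The execute section described above might look roughly like this in md-config.xml; the paths are placeholders for wherever you placed the scripts:

```xml
<!-- Schematic only; use the real paths where you placed the scripts. -->
<execute>
  <oozie-script-path>/home/cloudera/scripts/Workflow.py</oozie-script-path>
  <standalone-script-path>/home/cloudera/scripts/flume.sh</standalone-script-path>
</execute>
```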