
Updating my repo (#288)
DaneshKohina authored Nov 29, 2023
2 parents 7c272fc + b4436f3 commit 244ec10
Showing 63 changed files with 7,571 additions and 61 deletions.
2 changes: 2 additions & 0 deletions README.md
@@ -27,6 +27,7 @@ Potential Topics--
1. Set up
8. Unity
1. Introduction to Unity Basics
2. A Beginner's Guide for Unity UI Design

- Software Tools
1. Git
@@ -97,3 +98,4 @@ Potential Topics--
1. Beginner's guide to product management and becoming a successful product manager with case studies.
- Other useful resources
- Teamwork
1. Effective Leadership
19 changes: 19 additions & 0 deletions Topics/Development_Process.md
@@ -8,6 +8,8 @@

### [Django Project Deployment: AWS, Vercel, and Railway](./Development_Process/Django_Deployment_AWS_Railway_Vercel.md)

### [Node.js Project Deployment: Azure](./Development_Process/Azure_webapp_deployment_with_nodejs.md)

### [Automated Frontend Deployment with Vercel](./Development_Process/Frontend_Automated_Deployment_Vercel.md)

### [Flask Application Deployment on Heroku](./Development_Process/Flask_App_Deployment_Heroku.md)
@@ -29,6 +31,9 @@

### [Requirements.txt](./Development_Process/Build_Requirements/Requirements_txt.md)

## Build tools
### [Introduction to `make` and Makefiles](./Development_Process/Introduction_to_make_and_Makefiles.md)

## React Testing Library

### [React Testing Library](./Development_Process/React_Testing_Library.md)
@@ -37,6 +42,14 @@

### [URL Sanitization](./Development_Process/URL_Sanitization.md)

## Design Decisions

### [GraphQL vs. REST: Which API type to use?](./Development_Process/GraphQL_VS_REST.md)

## Serverless Computing

### [Serverless Computing](./Development_Process/Serverless_AWS_Lambda.md)

## SOLID Principles

SOLID is a mnemonic acronym for a set of five important software development principles. Following them leads to code that is easier to read, maintain, and extend, and ultimately to higher-quality software that is easier to evolve over time.
@@ -93,3 +106,9 @@ This is only a simplification of what "Clean Architecture" is; the topic is so v

## Code Smells
### [Code Smells](./Development_Process/Code_Smells.md)

## Prompt Engineering
### [Basics of Prompt Engineering](./Development_Process/Prompt_Engineering.md)

## Technical Documents
### [Intro to Request for Comments (RFCs)](./Development_Process/Technical_Documents/Intro_to_rfcs.md)
192 changes: 189 additions & 3 deletions Topics/Development_Process/Azure_webapp_deployment_with_nodejs.md

Large diffs are not rendered by default.

12 changes: 11 additions & 1 deletion Topics/Development_Process/Docker.md
@@ -4,6 +4,7 @@
### [Introduction](#introduction-1)
### [Installation](#installation-1)
### [Creating Dockerfiles](#creating-dockerfiles-1)
### [Using Existing Images](#using-existing-images-1)
### [Next Steps](#next-steps-1)
### [Docker Terminology](#docker-terminology-1)

@@ -36,6 +37,13 @@ Once you've installed Docker, to see it in action you can follow any one of thes
- [Dockerize a Flask App](https://www.freecodecamp.org/news/how-to-dockerize-a-flask-app/) (Super detailed step-by-step explanations tutorial for containerizing a flask app. I recommend this if you want to understand the process in detail)
- [Docker's official tutorial for containerizing an application](https://docs.docker.com/get-started/02_our_app/) (Can't go wrong with the official tutorial.)

## Using Existing Images

An alternative to building images manually is to use existing images from Docker Hub. Chances are there is already a Docker image for your purpose: from database images such as MySQL, MongoDB, and PostgreSQL, to out-of-the-box images such as WordPress for a WordPress website and Nextcloud for a Google Drive-esque cloud storage system. If you are unsure about creating your own image, you can always check out the ones that other people have published.

For a quick tutorial on how to use an image:
- [Using the MongoDB image](./MongoDB_in_Docker.md#using-the-official-image)
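As a quick sketch of what using an existing image looks like (the MongoDB image here is just an example; the container name and port mapping are assumptions):

```shell
# Download the official MongoDB image from Docker Hub
docker pull mongo

# Run it in the background, exposing MongoDB's default port 27017
docker run -d --name my-mongo -p 27017:27017 mongo

# Confirm the container is up
docker ps --filter name=my-mongo
```

Once running, you can connect to it exactly as you would to a locally installed MongoDB server.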

### Automatic Dockerfile Generation

Since Docker is widely used, there is a lot of Dockerfile-related knowledge in ChatGPT's training data, and the AI is capable of generating Dockerfiles for most software architectures. If you want to easily containerize your app, you can use OpenAI's ChatGPT 3.5-turbo to generate the Dockerfile for you. To do this, you first need to gather a tree of your project directory for ChatGPT to better understand your project architecture (On Linux/macOS, run `tree -I node_modules` in your project directory). Then, you can ask ChatGPT using something similar to the following prompt:
@@ -189,9 +197,11 @@ services:
POSTGRES_DB: postgres
```

For an alternative, you can check out another example [using MongoDB](./MongoDB_in_Docker.md#setup-alongside-application).

Since the database is contained within the docker-compose network and cannot be reached from the wider internet, it is reasonably safe to use the default `postgres` user and password. However, if you want to expose your database (which is not recommended), you can add the port mapping `5432:5432` to the `db` service and use a stronger password.

If you are using any other database, you can find the docker image on [Docker Hub](https://hub.docker.com/search?q=&type=image&category=Database) and follow the instructions there. Please be sure to read the docker container's documentation carefully! Most questions regarding database images can be answered by reading the documentation.
If you are using any other database, you can find the docker image on [Docker Hub](https://hub.docker.com/search?q=&type=image&category=Database) and follow the instructions there. If you want a quick start tutorial, you can check out the tutorial for MongoDB above. Please be sure to read the docker container's documentation carefully! Most questions regarding database images can be answered by reading the documentation.

### Automatic Updates

158 changes: 158 additions & 0 deletions Topics/Development_Process/Github_Actions.md
@@ -0,0 +1,158 @@
# Automating Python Testing and Deployment with GitHub Actions and SSH
GitHub Actions is a powerful workflow automation tool that allows you to automate your software development workflows directly in your GitHub repository. In this guide, we'll walk through the process of setting up automated testing and deployment for a Python project using GitHub Actions and SSH.


## Table of Contents
- [Table of Contents](#table-of-contents)
- [Prerequisites](#prerequisites)
- [Setting up the remote server](#setting-up-the-remote-server)
- [Installing Dependencies](#installing-dependencies)
- [Cloning the project](#cloning-the-project)
- [Creating a virtual environment](#creating-a-virtual-environment)
- [Deployment Script](#deployment-script)
- [What is tmux and why do we need it?](#what-is-tmux-and-why-do-we-need-it)
- [Setting up the GitHub repository](#setting-up-the-github-repository)
- [Secrets](#secrets)
- [Workflow](#workflow)
- [Testing the workflow](#testing-the-workflow)
- [Further Reading](#further-reading)
- [Conclusion](#conclusion)


## Prerequisites
Before you start, ensure you have the following:
- A Python project hosted on GitHub.
- Make sure your project has a `requirements.txt` file containing all of the project's dependencies.
- This guide assumes that your tests have been written using the pytest framework.
- A remote server with SSH access. This can be a VPS, a dedicated server, or even a Raspberry Pi.
- Keep the hostname, port, username, and private key for the remote server handy. You'll need them later.
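If you don't yet have a dedicated key pair for deployments, you can generate one; the filename and comment below are examples, and the public key must be appended to `~/.ssh/authorized_keys` on the server:

```shell
# Generate a dedicated ed25519 key pair with no passphrase (non-interactive)
ssh-keygen -t ed25519 -f deploy_key -N "" -C "github-actions-deploy"

# The public half goes on the server (append it to ~/.ssh/authorized_keys there).
# The private half (deploy_key) is what you will paste into the SSH_PRIVATE_KEY secret.
cat deploy_key.pub
```

Using a separate key for automation means you can revoke it later without affecting your personal SSH access.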

## Setting up the remote server
First, we'll need to set up the remote server. We'll be using a VPS running Ubuntu for this guide, but the steps should be similar for other Linux distributions.

### Installing Dependencies
First, we'll need to install the dependencies for our project. We'll be using Python 3.10 for this guide, but you can use any version of Python that your project supports.

```bash
sudo apt update # Update the package index
sudo apt install python3 python3-venv python3-pip tmux # Install Python 3, venv, pip, and tmux
```

### Cloning the project
Now, we'll clone the project from GitHub. Make sure you replace the `URL` with the URL of your project and `PROJECT_NAME` with the name of your project.

```bash
git clone [URL] # Clone the project from GitHub
cd [PROJECT_NAME] # Change into the project directory
```

### Creating a virtual environment
Next, we'll create a virtual environment for our project. This will allow us to install our project's dependencies without affecting the rest of the system.

```bash
python3 -m venv venv # Create a virtual environment named "venv"
source venv/bin/activate # Activate the virtual environment
```

### Deployment Script
Next, we'll create a deployment script. This script will be run by GitHub Actions to deploy the project to the remote server. Make sure you replace `PROJECT_NAME` with the name of your project.

```bash
cd .. # Change into the parent directory (leaving the project directory)
touch deploy.sh # Create the deployment script
chmod +x deploy.sh # Make the deployment script executable
```

Then, open the deployment script in your favorite text editor and add the following code:

```bash
#!/bin/bash

cd [PROJECT_NAME] # Change into the project directory
source venv/bin/activate # Activate the virtual environment

git pull # Pull the latest changes from GitHub

pip install -r requirements.txt # Install the project's dependencies
python -m pytest # Run the project's tests

tmux kill-session -t [PROJECT_NAME] # Kill the existing tmux session
tmux new-session -d -s [PROJECT_NAME] # Create a new tmux session
tmux send-keys -t [PROJECT_NAME] "python3 main.py" ENTER # Start the project
```

The deployment script will do the following:
1) Change into the project directory
2) Activate the virtual environment
3) Pull the latest changes from GitHub
4) Install the project's dependencies
5) Run the project's tests
6) Kill the existing tmux session
7) Create a new tmux session
8) Start the project

#### What is tmux and why do we need it?
tmux is a terminal multiplexer that allows you to run multiple terminal sessions in a single window. We'll be using tmux to run the project in the background so that we can close the SSH connection without stopping the project.
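A few tmux commands are enough to follow along; the session name `demo` here is just an example:

```shell
tmux new-session -d -s demo         # start a detached session named "demo"
tmux send-keys -t demo "top" ENTER  # run a command inside it
tmux attach -t demo                 # attach to watch it (press Ctrl+B, then D, to detach)
tmux kill-session -t demo           # stop the session and everything running in it
```

The deployment script above relies on exactly this pattern: kill any old session, start a fresh one, and launch the app inside it.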

## Setting up the GitHub repository
Now that we've set up the remote server, we'll need to set up the GitHub repository. We'll be using the GitHub web interface for this guide, but you can also use the GitHub CLI if you prefer.

### Secrets
First, we'll need to add our SSH credentials to the repository as secrets. This will allow us to access the remote server without exposing our credentials in the workflow file. To do this, go to the repository settings and click on "Secrets" in the sidebar. Then, click on "New repository secret" and add the following secrets:
- `SSH_HOSTNAME`: The hostname of the remote server.
- `SSH_PORT`: The SSH port for the remote server. (Default: 22)
- `SSH_USERNAME`: The username for the remote server.
- `SSH_PRIVATE_KEY`: The private key for the remote server.

### Workflow
Next, we'll need to create a workflow file. This is a [YAML file](https://docs.ansible.com/ansible/latest/reference_appendices/YAMLSyntax.html) that tells GitHub Actions what to do when certain events occur. To do this, create a new file named `.github/workflows/main.yml` and add the following code:

```yaml
name: Test and Deploy

on:
push:
branches: [ "main" ]

jobs:
build:
runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v3
- name: Set up Python 3.10
uses: actions/setup-python@v3
with:
python-version: "3.10"
- name: Install dependencies
run: |
python -m pip install --upgrade pip
python -m pip install pytest
python -m pip install -r requirements.txt
- name: Test with pytest
run: |
python -m pytest
- name: Deploy
uses: appleboy/[email protected]
with:
host: ${{ secrets.SSH_HOSTNAME }}
username: ${{ secrets.SSH_USERNAME }}
key: ${{ secrets.SSH_PRIVATE_KEY }}
port: ${{ secrets.SSH_PORT }}
script: sh deploy.sh
```
This workflow will run the following steps when we push to the main branch:
1) Install Python 3.10 in a temporary environment
2) Install the project's dependencies from `requirements.txt`
3) Run the project's tests using pytest
4) Run the `deploy.sh` script on the remote server using SSH if the tests pass.

## Testing the workflow
Now that we've set up the workflow, we'll need to test it to make sure it works. To do this, push a commit to the main branch and check the Actions tab in the repository. If everything worked correctly, you should see a green checkmark next to the commit message like in [this image](https://i.imgur.com/PYAgSb9.png). If not, check the logs for the workflow to see what went wrong by clicking the commit message and then the build job.
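If you prefer the terminal, the GitHub CLI (`gh`) can show the same information; this is an optional sketch that assumes you have installed `gh` and already run `gh auth login`:

```shell
# List the five most recent workflow runs for the current repository
gh run list --limit 5

# Inspect a specific run's logs (replace [RUN_ID] with an id from the list above)
gh run view [RUN_ID] --log
```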

## Further Reading
If you'd like to learn more about GitHub Actions, check out the [official documentation](https://docs.github.com/en/actions).

## Conclusion
In this guide, we walked through the process of setting up automated testing and deployment for a Python project using GitHub Actions and SSH. We also tested the workflow to make sure it works correctly.
95 changes: 95 additions & 0 deletions Topics/Development_Process/Gradle_build_tool.md
@@ -0,0 +1,95 @@
# The Gradle build tool for Java

## Introduction
Build tools are essential in software development because they automate the process of converting our source code into executable programs. The process involves compiling code, linking resources, executing tests, and deploying the product. Without build tools, developers would have to manually execute each of these steps, which can be time-consuming and prone to human error, especially in projects with many dependencies.

Gradle is one of the most powerful and flexible build tools. It's primarily used for Java projects but can also be applied to other programming languages such as C/C++ and Python. It's designed to support complex workflows and provides a versatile way to define build logic. Unlike some other build tools, Gradle scripts are generally concise and human-readable. Gradle is also highly extensible, offering a rich API that enables developers to write custom plugins and tasks. Its incremental build capabilities save time by only running tasks that are necessary. As a result, Gradle has become a popular choice among many developers.

## Dependency management with Gradle
Dependency management is a critical feature of Gradle that allows developers to automatically download and integrate libraries and other resources that their project depends upon. Many projects split unrelated functionality into separate modules, and each module declares its own dependencies.

Gradle allows you to define different types of dependencies. Here's an example of how you might declare them in a `build.gradle` file, the script a Gradle project uses to define its build configuration, including dependencies, plugins, tasks, and other build-related settings. We will only cover a few types of dependencies, but keep in mind there are more!

```groovy
dependencies {
    // Implementation dependencies are specific to a module and only used internally within the module
    // Define the dependency with group:name:version
    implementation 'org.apache.commons:commons-lang3:3.10'
    // Use the 'api' keyword for dependencies that should be made accessible to other modules
    api 'com.google.guava:guava:29.0-jre'
    // Test dependencies are resources used exclusively for testing
    testImplementation 'junit:junit:4.13'
}

repositories {
    // Use Maven Central repository for most dependencies
    mavenCentral()
}
```

During a build, Gradle finds and downloads each of the dependencies in a process called dependency resolution. It then stores resolved dependencies in a local cache called the dependency cache. Future builds use this cache to speed up the build process and avoid unnecessary network calls. This is yet another advantage of using build tools!
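If you want to see what was actually resolved, Gradle's built-in reporting tasks can print the dependency tree; the configuration and library names below are examples:

```shell
# Print the resolved dependency tree for the runtime classpath
./gradlew dependencies --configuration runtimeClasspath

# Explain why a particular library version was selected and who requested it
./gradlew dependencyInsight --dependency commons-lang3
```

These reports are often the fastest way to find out where an unexpected version of a library is coming from.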

Gradle supports dynamic versions and classifier dependencies, which are useful for managing dependencies in a more nuanced way. For example, you can use a dynamic version like '4.+' to always use the latest 4.x.x version of a library:

```groovy
implementation 'com.somecompany:somelibrary:4.+'
```

In addition, Gradle provides the ability to exclude certain transitive dependencies (dependencies pulled in indirectly by your direct dependencies) that you don't want to include, or to force a certain version of a library if there's a conflict between different modules:

```groovy
implementation('com.somecompany:somelibrary:1.2.3') {
    // Exclude certain dependencies
    exclude group: 'log4j', module: 'log4j'
}

// Apply configuration rules to all dependency configurations in the project
configurations.all {
    // In the case of a conflict, force a certain library version to be used
    resolutionStrategy.force 'org.apache.commons:commons-lang3:3.9'
}
```

## Dependency Resolution in Gradle
You're likely to encounter version conflicts between dependencies. Gradle offers multiple strategies to help handle them:

### Consistent Versions with Platform Constraints
Gradle allows you to define a platform to enforce consistent versions of dependencies. A platform is a set of agreed-upon dependency versions, which helps keep versions consistent across modules. This is particularly useful in large multi-module projects. Here's how you might use it:

```groovy
dependencies {
    // Define your constraints under the dependencies block
    constraints {
        api('org.springframework:spring-framework-bom:5.2.7.RELEASE') {
            // Provide a reason for using this specific version
            because 'We want to align all Spring modules versions'
        }
    }
    implementation 'org.springframework:spring-core'
    implementation 'org.springframework:spring-context'
}
```

### Custom Dependency Resolution Logic
Finally, here's a sneak peek at the more advanced things you can do with Gradle. Sometimes you may need to apply custom logic for dependency resolution. This capability is particularly beneficial in projects where standard dependency rules don't align with project-specific requirements, such as managing version conflicts in complex multi-module projects, or when integrating with non-standard repositories. Gradle allows you to intercept and modify the dependency resolution process:

```groovy
configurations.all {
    resolutionStrategy.eachDependency { DependencyResolveDetails details ->
        // Check if the current dependency is 'httpclient' from 'org.apache.httpcomponents'
        if (details.requested.group == 'org.apache.httpcomponents' && details.requested.name == 'httpclient') {
            // Force the use of a specific version of the 'httpclient' library
            details.useVersion '4.5.10'
            details.because 'Compatibility issues with newer versions'
        }
    }
}
```
```

Builds of projects with many declared dependencies can be difficult to debug. In addition to the conflict resolution strategies listed above, Gradle provides tools to visualize and analyze a project's dependency graph. You can use its Build Scan® tool to generate reports that tell you which dependencies failed to resolve. Read more about the Build Scan® tool here:
https://scans.gradle.com/

## Official documentation
Gradle official documentation:
https://docs.gradle.org/current/userguide/userguide.html

