updated root and backend readme files, added one for infrastructure closes #37
1 parent 6ee27b9 · commit 4519f8e
Showing 3 changed files with 82 additions and 14 deletions.
@@ -1,20 +1,18 @@
-This is the Google AppScript automation for generating the resulting scores and reports for ADO responses to the CMS Cloud ZTMM data call.
+# Zero Trust Maturity Framework (ZTMF) Scoring

-Dependencies:
-- Source Google Sheets results for data call
-- Google Drive space
-- Google Sheets report template

-The source Google sheets results for the data call is an output of the Typeform survey created for this data call. Tabs within this document include a listing of all questions and their respective full text answers, a list of all questions with the score for each answer, and a list of each pillar and functions mapped to the question number.
+The ZTMF Scoring Application allows ADOs to view their Zero Trust Maturity score online. An upcoming release will allow new ADOs to answer the questionnaire from scratch, and existing ADOs to update their answers, all within a web-based interface. The interface and the API are protected by AWS Verified Access, which requires authentication via IDM (Okta).

-Associating this script with the sheets file within a google workspace will add a menu bar option named 'Score & Report'. Under this menu will be the options for each step within the process of creating reports.
+This monorepo contains the following major components:
+- `backend/` includes a GraphQL API and an ETL process, both written in Go
+- `infrastructure/` includes all AWS resources as IaC managed by Terraform
+- `ui/` includes a React-based SPA written in TypeScript
+- `.github/workflows/` contains workflows for GitHub Actions to test, build, and deploy to AWS

-First, the 'Tabulate' function. This function takes the output from the TypeForm data call, compares each answer to the scoresheet, and outputs the corresponding score for each question into another sheet. In this output sheet the first 5 columns are copied directly as they are the submitter's association and contact information.
+## Architecture

-Next the 'vizTabulate' function. This function serves to slice out the scores for foundational categories (Visibility & Analytics, and Governance) into their individual pillar associated scores. Using the same process as the 'Tabulate' function, pillar associated question scores are appended to each row of the results sheet.
+The ZTMF Scoring Application comprises a React-based Single-Page Application (SPA) that retrieves data from the GraphQL API. The web assets for the SPA are hosted in an S3 bucket, and the API is hosted as an ECS service with containers deployed via Fargate.

-The 'Pillar_Score' function should be run next. This summarizes each response into scores by pillar into another new sheet. Scores are calculated using the data from the scores output and averaged based on the number of answered questions per pillar. This ensures that skipped or unknown questions are not factored into the scoring.
+Both the API ECS service and the S3 bucket are configured as targets behind an _internal_ application load balancer (ALB), with S3 connectivity provided by PrivateLink VPC endpoints. The internal ALB is the target for the AWS Verified Access endpoint. The public domain name points to the Verified Access endpoint, which in turn acts as a proxy to the application, allowing access only to known trusted identities. AWS Verified Access is configured to use IDM (Okta) as the user identity trust provider, so users with the ZTMF job code can log in via IDM and access the application.

-Once all this supporting information is calculated and stored the 'Generate_Slides' function is run. This function creates a copy of the report slides template. Then all of the relevant information for each entry is generated and populated. Submission information is copied from the score results sheet into the slides. A radar chart of the associated pillar scores is generated within sheets and then copied to the slides presentation. Then a score for each function within a pillar is calculated using the question mapping reference sheet and averaged based on how many questions were answered. This is copied into the slides report and then the raw scores are replaced with their respective level in the Zero Trust Maturity Model (T,A,O). Finally this renames the completed slides file to the name of the submitter's program.

-The 'Generate_Slides' function will create reports for every entry row in the results.
+Data delivered by the API is stored in an RDS Aurora serverless PostgreSQL server.
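The internal-ALB arrangement described above could be sketched in Terraform roughly as follows. This is a minimal illustration only: the resource names, variables, and target group settings are assumptions, not the repo's actual configuration.

```hcl
# Hypothetical sketch of the internal ALB fronting the API ECS service.
# All names and variables here are illustrative.
resource "aws_lb" "internal" {
  name               = "ztmf-internal"
  internal           = true
  load_balancer_type = "application"
  subnets            = var.private_subnet_ids
}

resource "aws_lb_target_group" "api" {
  name        = "ztmf-api"
  port        = 443
  protocol    = "HTTPS"
  target_type = "ip" # Fargate tasks register by IP
  vpc_id      = var.vpc_id
}
```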
@@ -1 +1,38 @@
-# api
+# Backend
+
+The backend consists of a GraphQL API and an ETL process, both written in Go.
+
+## Developer Requirements and Config
+
+- Go ~>1.22.0
+- Docker buildx
+
+## Application Architecture
+
+- `cmd/` contains code compiled as separate binaries
+  - `api/` the GraphQL API
+  - `etl/` the ETL process that pulls scores from CSV into PostgreSQL
+- `internal/` contains libraries common to both binaries
+  - `config/` pulls API and DB settings from environment variables
+  - `db/` is a light wrapper around pgx that handles passing config and returning a DB connection
+  - `secrets/` is a light wrapper around the AWS Secrets Manager SDK to cache and refresh secrets that could be rotated while a process is running
+### GraphQL API
+
+GraphQL was chosen because the primary purpose of this API is to drive the UI (see `ui/`). While REST APIs are popular and still make sense for some use cases, they often lead to the UI client making more requests than necessary and retrieving more data than necessary ("overfetching"). GraphQL is also faster and easier to develop against when building specifically for a UI.
+
+GraphQL functionality is provided by [github.com/graph-gophers/graphql-go](https://github.com/graph-gophers/graphql-go), with `main.go` providing initial bootstrapping of the library along with the necessary HTTP listener.
+
+The API is designed to serve with TLS when a certificate and key are provided, or to serve unsecured HTTP when they are not (useful for local development). The Dockerfile will generate a self-signed certificate, which is fine since the containers are behind an AWS application load balancer that accepts untrusted certificates.
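The TLS-or-plain-HTTP behavior described above can be sketched as below. The `CERT_FILE`/`KEY_FILE` environment variable names are illustrative assumptions, not the API's actual configuration keys.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"os"
)

// useTLS reports whether both a certificate and a key path are configured.
func useTLS(cert, key string) bool {
	return cert != "" && key != ""
}

// serve starts the API with TLS when a cert and key are configured and
// falls back to unsecured HTTP otherwise (handy for local development).
// The env var names below are assumptions for illustration.
func serve(addr string, handler http.Handler) error {
	cert, key := os.Getenv("CERT_FILE"), os.Getenv("KEY_FILE")
	if useTLS(cert, key) {
		return http.ListenAndServeTLS(addr, cert, key, handler)
	}
	log.Println("no certificate configured; serving plain HTTP")
	return http.ListenAndServe(addr, handler)
}

func main() {
	fmt.Println(useTLS("cert.pem", "key.pem")) // true
	fmt.Println(useTLS("", ""))                // false
}
```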
+`schema.go` contains the GraphQL schema, while the following files, usually named after DB tables, provide query resolvers:
+- `fismasystems.go`
+- `functions.go`
+- `functionscores.go`
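With graph-gophers/graphql-go, each schema field is resolved by a method on a resolver struct. The self-contained sketch below mimics that convention without importing the library; the `FismaSystem` shape and field names are assumptions for illustration.

```go
package main

import "fmt"

// fismaSystem is a hypothetical row shape backing a resolver.
type fismaSystem struct {
	id   string
	name string
}

// fismaSystemResolver exposes one method per schema field, following the
// graph-gophers convention.
type fismaSystemResolver struct{ s fismaSystem }

func (r *fismaSystemResolver) ID() string   { return r.s.id }
func (r *fismaSystemResolver) Name() string { return r.s.name }

type rootResolver struct{ systems []fismaSystem }

// FismaSystems would back a hypothetical `fismaSystems` query field.
func (r *rootResolver) FismaSystems() []*fismaSystemResolver {
	out := make([]*fismaSystemResolver, 0, len(r.systems))
	for _, s := range r.systems {
		out = append(out, &fismaSystemResolver{s})
	}
	return out
}

func main() {
	root := &rootResolver{systems: []fismaSystem{{id: "1", name: "ExampleSystem"}}}
	fmt.Println(root.FismaSystems()[0].Name()) // ExampleSystem
}
```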
+### Docker
+
+`Dockerfile` is a multi-stage build: the first stage begins with a Debian-based image with Go and Go tools installed, and the second stage is `FROM scratch` to reduce the final image size to the absolute minimum. OpenSSL is used to generate self-signed certificates so the API can run with `http.ListenAndServeTLS` to achieve end-to-end encryption.
+
+Final images are tagged with the current commit SHA and pushed to the private ECR registry, and the commit SHA is then stored in an SSM Parameter so that Terraform can use it when deploying a new container to ECS.
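A multi-stage build of the kind described might be sketched like this. The stage names, paths, and OpenSSL subject are illustrative assumptions, not the repo's actual Dockerfile.

```dockerfile
# Hypothetical sketch of the multi-stage build described above;
# paths, flags, and names are illustrative.
FROM golang:1.22-bookworm AS build
WORKDIR /src
COPY . .
# Static binary so it can run in a scratch image
RUN CGO_ENABLED=0 go build -o /api ./cmd/api
# Self-signed certificate for end-to-end TLS behind the ALB
RUN apt-get update && apt-get install -y openssl \
 && openssl req -x509 -newkey rsa:4096 -nodes -days 365 \
      -subj "/CN=localhost" -keyout /key.pem -out /cert.pem

FROM scratch
COPY --from=build /api /api
COPY --from=build /cert.pem /key.pem /
ENTRYPOINT ["/api"]
```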
@@ -0,0 +1,33 @@
+# Infrastructure as Code via Terraform
+
+All resources are managed by Terraform and deployed into AWS in the `us-east-1` region. There are two environments, `dev` and `prod`, each with its own account. Each account houses its own S3 bucket that serves as a backend state store for Terraform. Both accounts include numerous resources, such as VPCs, that were created by CMS Cloud and cloud security. These are referenced with `data` sources to avoid hardcoding IDs.
+
+## State
+
+State is stored in S3 buckets, each environment with its own bucket and state store.
+To switch environments you need to init Terraform with a backend config like so:
+```bash
+terraform init -backend-config="config/backend-<env>.tf" -reconfigure
+```
+Where `<env>` is one of `dev` or `prod`. See the files in `infrastructure/config/`.
+
+## Vars
+
+While there are only two input vars, they still need to be passed in during plan and apply:
+```bash
+terraform <plan|apply> -var-file="tfvars/<env>.tfvars"
+```
+Where `<env>` is one of `dev` or `prod`. See the files in `infrastructure/tfvars/`.
+
+## Modules
+
+Only one custom module is used, as a factory for IAM roles. CMS requires that all IAM roles include a `path` and a `permissions_boundary`. These are expressed in `modules/role/main.tf`, and all roles created for use by the application are created by calling the module as in the following example:
+
+```hcl
+module "<identifier>" {
+  source    = "./modules/role"
+  name      = "<name>"
+  principal = { Service = "ecs-tasks.amazonaws.com" } // example
+  ...
+}
+```
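The module's internals might look roughly like this. The variable names, path, and boundary ARN are illustrative assumptions; the actual values live in `modules/role/main.tf`.

```hcl
# Hypothetical sketch of modules/role/main.tf.
variable "name" {}
variable "principal" { type = map(string) }

resource "aws_iam_role" "this" {
  name                 = var.name
  path                 = "/delegatedadmin/developer/"                      # example path
  permissions_boundary = "arn:aws:iam::123456789012:policy/example-boundary" # example ARN
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Action    = "sts:AssumeRole"
      Effect    = "Allow"
      Principal = var.principal
    }]
  })
}
```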