GraphQL with Go boilerplate
The motivation is to create a boilerplate Go project for anyone who favours modularity, scalability, and separation of concerns.
This part is more about what the boilerplate is than what it does, so let's spin it up.
In your home directory, using your terminal, create a project:
$ mkdir <preferred project name>
$ cd <project name>
$ go mod init github.com/<project name> # enable dependency management
The motivation behind the project structure is that GraphQL resolvers (in gql-gateway/) handle only request validation, calling the right service/repository methods, and returning results. Repositories (in internal/repo/) are the only layer that knows how to talk to the database. Models hold the business data structures, not logic.
Resolvers don't know anything about SQL or how data is stored. This lets you swap PostgreSQL for another database later without touching the resolvers, and it makes testing easier (you can mock repositories without needing a DB).
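As a minimal sketch of that separation (all names here are hypothetical, not taken from the boilerplate): the resolver depends on a small interface, and a test can satisfy that interface with an in-memory fake instead of a real database.

```go
package main

import "fmt"

// Product is a trimmed-down model for illustration.
type Product struct {
	ID   int
	Name string
}

// ProductStore is the contract the resolver depends on; the real
// implementation would live in internal/repo and talk to Postgres.
type ProductStore interface {
	GetProduct(id int) (*Product, error)
}

// Resolver knows nothing about SQL; it validates input and delegates.
type Resolver struct {
	Store ProductStore
}

func (r *Resolver) Product(id int) (*Product, error) {
	if id <= 0 {
		return nil, fmt.Errorf("invalid product id: %d", id)
	}
	return r.Store.GetProduct(id)
}

// fakeStore is an in-memory stand-in used for tests; no DB required.
type fakeStore struct{ data map[int]*Product }

func (f *fakeStore) GetProduct(id int) (*Product, error) {
	p, ok := f.data[id]
	if !ok {
		return nil, fmt.Errorf("product %d not found", id)
	}
	return p, nil
}

func main() {
	r := &Resolver{Store: &fakeStore{data: map[int]*Product{1: {ID: 1, Name: "Laptop"}}}}
	p, err := r.Product(1)
	fmt.Println(p.Name, err) // Laptop <nil>
}
```

Swapping `fakeStore` for the pgx-backed repository later changes nothing in the resolver, which is the whole point of the layering.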
Demo
├── db/ # DB setup logic
├── docker/ # Docker-related scripts/configs
├── gql-gateway/ # GraphQL resolvers and entry point
├── internal/
│ └── repo/ # Data access layer (repositories)
├── migrations/ # Database migration scripts
├── models/ # Data models used across the app
├── pkg/ # Common utility packages (e.g., JWT, auth)
├── tests/ # Integration and unit tests
│
├── docker-compose.yml # Local dev containers (app, db, etc.)
├── main.go # Entry point of the application
To demo how this works, we are going to create a hypothetical store database with customers, categories, products, and orders, where products are organised into a hierarchy of categories. Now that that is out of the way: I used pgx with raw SQL queries to get more control and access to DB-specific features, not for performance, which is a very controversial subject I don't want to get into 😂
$ go get github.com/jackc/pgx/v5/pgxpool
Install this guy and add a file in the migrations/ folder like
$ touch Demo/migrations/001_init_schema.up.sql
Example migration creating the products table:
-- Create products
CREATE TABLE products (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL,
    description TEXT,
    price NUMERIC(10, 2) NOT NULL,
    category_id INTEGER REFERENCES categories(id) ON DELETE SET NULL
);
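Note that products references categories(id), so the categories table must already exist in the migration. A sketch of what it could look like (the actual definition lives in the repo; the self-referencing parent_id is one common way to model the category hierarchy mentioned above):

```sql
-- Create categories; the self-referencing parent_id gives us the hierarchy
CREATE TABLE categories (
    id SERIAL PRIMARY KEY,
    name TEXT NOT NULL,
    parent_id INTEGER REFERENCES categories(id) ON DELETE CASCADE
);
```

Place this before the products statement in 001_init_schema.up.sql so the foreign key resolves.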
Inside your internal/repo folder (or any relevant folder), add
$ touch Demo/internal/repo/product.go
Do not forget to create a Product struct. In my case, it lives inside models/models.go for simplicity:
type Product struct {
	ID          int     `json:"id"`
	Name        string  `json:"name"`
	Description *string `json:"description,omitempty"`
	Price       float64 `json:"price"`
	CategoryID  *int    `json:"category_id,omitempty"`
}
Example repository method to create a product:
```go
// CreateProduct inserts a new product and returns the stored row.
func (r *ProductRepo) CreateProduct(ctx context.Context, name string, description *string, price float64, categoryID *int) (*models.Product, error) {
	var p models.Product
	err := r.DB.QueryRow(ctx,
		`INSERT INTO products (name, description, price, category_id)
		 VALUES ($1, $2, $3, $4)
		 RETURNING id, name, description, price, category_id`,
		name, description, price, categoryID,
	).Scan(&p.ID, &p.Name, &p.Description, &p.Price, &p.CategoryID)
	if err != nil {
		return nil, fmt.Errorf("create product: %w", err)
	}
	return &p, nil
}
```
We then set up a PostgreSQL database connection using the pgx driver's connection pool, provided by the package we installed earlier. To support this and run migrations reliably, I used Docker 😎
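A minimal sketch of that connection setup follows; the package layout, `Connect` function, and `DATABASE_URL` variable are my assumptions for illustration, so check the repo for the real db/ setup. It needs a running database, so it is bootstrap code rather than something you can run standalone.

```go
// db/db.go — a sketch of the connection setup; the exported Pool is
// what main.go later hands to the repositories.
package db

import (
	"context"
	"fmt"
	"os"

	"github.com/jackc/pgx/v5/pgxpool"
)

var Pool *pgxpool.Pool

// Connect reads DATABASE_URL (e.g. postgres://user:pass@db:5432/demo)
// and opens a pgx connection pool.
func Connect(ctx context.Context) error {
	pool, err := pgxpool.New(ctx, os.Getenv("DATABASE_URL"))
	if err != nil {
		return fmt.Errorf("create pool: %w", err)
	}
	// Verify the connection before the server starts taking requests.
	if err := pool.Ping(ctx); err != nil {
		return fmt.Errorf("ping db: %w", err)
	}
	Pool = pool
	return nil
}
```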
This is pretty straightforward: it starts a PostgreSQL container and automatically runs the schema migrations once the DB is ready.
After creating the .env config, ensure you have initialized the repository in the server (main.go in this case):
productRepo := repo.NewProductRepo(database.Pool)
The significance of this will become apparent when we merge the startup servers for the business logic and the GraphQL gateway.
Now run:
$ docker-compose up --build
If everything is set up well, you will see this:
Refer to my code on GitHub for more information.
Next, we will talk about the GraphQL setup using gqlgen, a Go library that facilitates the creation of our gateway. Watch out for the next part on creating those schemas, and leave a comment below.
See you soon!