Compare commits
197 commits: 8652fb7b07 ... master
| SHA1 | Author | Date | Message |
|---|---|---|---|
| 09cbb308d2 | |||
| c120b3d4d2 | |||
| 3381945cea | |||
| d12aabdccc | |||
| d38a348a06 | |||
| 42873f4d2d | |||
| c1bb006292 | |||
| 9bdaaeb295 | |||
| 6596ec2d9b | |||
| 0103aa8c16 | |||
| 37b13cfd6a | |||
| 29d27dccfb | |||
| cca27aa971 | |||
| 75bbb322d3 | |||
| 1a79b5fa9f | |||
| 9288aef5c7 | |||
| 29a2dfc606 | |||
| 2999325de9 | |||
| 06584ffedc | |||
| 90f91bd017 | |||
| 2b880be833 | |||
| 64a5a29809 | |||
| c1bae690b3 | |||
| f8e912e201 | |||
| ff8650bedf | |||
| 13586f5c44 | |||
| ead1e8d57c | |||
| 87d3044959 | |||
| cc2dc49511 | |||
| 79e72505f8 | |||
| 21c8f95f86 | |||
| 6a402795b9 | |||
| f342d99650 | |||
| 9d476ee209 | |||
| 88ff7d0077 | |||
| d91ea80bef | |||
| 682889f878 | |||
| f07e0be31d | |||
| 6ec2bbe02d | |||
| da68435673 | |||
| b6fdd922ba | |||
| 5648ea6c54 | |||
| d672ccd433 | |||
| a0614609a2 | |||
| 0ca29a894a | |||
| 5537385dad | |||
| dbe9193d21 | |||
| 1a3559f72b | |||
| 871bc28a19 | |||
| 2e719ca06d | |||
| b9e1c9546f | |||
| 4672e75bcf | |||
| d33e943dd4 | |||
| c4eaabaddc | |||
| 3dd7840b06 | |||
| 37cd721066 | |||
| 797dd1044b | |||
| 95aef784e1 | |||
| 9c5ee08284 | |||
| 8b047db9cc | |||
| 2351d799a7 | |||
| 8be92eda71 | |||
| 9f68bcffb5 | |||
| c3e06350dd | |||
| b509eff5b3 | |||
| dacedb417f | |||
| 8874fad520 | |||
| 493eb495a2 | |||
| e3a27e7779 | |||
| 566557f8b0 | |||
| 4e602e1783 | |||
| 081b8ae6ff | |||
| a0d959bdce | |||
| aa322301fb | |||
| 139b1defe7 | |||
| d499727050 | |||
| 647943abbc | |||
| e0b317cdf3 | |||
| f1a4fa002b | |||
| 385f92458f | |||
| ed9f98493f | |||
| eaa68c0355 | |||
| 1d24b113fd | |||
| 3c0a2f0a11 | |||
| ce5b8a19ee | |||
| 837d1c6a5d | |||
| 0822bc9eac | |||
| 70e7817f33 | |||
| 439e8bd489 | |||
| 7f1cfa3c98 | |||
| 8e46dfb3ac | |||
| 83d99ba809 | |||
| 5cbb6906a1 | |||
| d84646800c | |||
| 73f8184b05 | |||
| 183f0b9fd3 | |||
| 94127fdae4 | |||
| f90fa7dbf8 | |||
| 6002e48a44 | |||
| f534782978 | |||
| 2e030ded6c | |||
| 975fa533ca | |||
| 0bbf852776 | |||
| 91512af825 | |||
| 380e81014a | |||
| bccee6dd51 | |||
| 93a3c88852 | |||
| 12f9d78728 | |||
| cb66b6e687 | |||
| b338e68276 | |||
| 9b284ee9d2 | |||
| 3c4adf7dff | |||
| 8c84cfe27e | |||
| 5bd7726b1c | |||
| 15ee533cb1 | |||
| ab64aada3e | |||
| 3271c16489 | |||
| 5d632d1a57 | |||
| 01be7dc4ea | |||
| 4bd66d5e32 | |||
| 94049c3213 | |||
| abb5fc9558 | |||
| 4bb88c7b46 | |||
| 4334628792 | |||
| 0e5b782164 | |||
| b745a786a7 | |||
| 9c3a56c7cc | |||
| bf7e5aee42 | |||
| 891f8b1d7a | |||
| d7f46ae405 | |||
| 0e54b3d302 | |||
| c7c60af7fe | |||
| a305d23785 | |||
| c6fd2f8bae | |||
| a47b67faff | |||
| d0168cafe1 | |||
| afd0ec1001 | |||
| da9c128a8c | |||
| b26712e285 | |||
| ac004a0e8e | |||
| 0c6c138da5 | |||
| 79f62258e2 | |||
| 9fdf837c5a | |||
| adc8ab055a | |||
| 8c89a93532 | |||
| 36d0c6b054 | |||
| 1650823a73 | |||
| 9f96b6dd71 | |||
| f149643ebc | |||
| 2860089d26 | |||
| 6586e5e060 | |||
| afd61b403c | |||
| b09e5db6c0 | |||
| 12d71ef3bb | |||
| 760de11c42 | |||
| 11c86325df | |||
| dc7af726db | |||
| b95fac6927 | |||
| e2e246b52b | |||
| 885bf9a72e | |||
| d5ca3c6917 | |||
| a292f745bf | |||
| 71379d07db | |||
| dc7eaa3503 | |||
| a29a5a1ae7 | |||
| 30a6ea6230 | |||
| b60c7533ff | |||
| 213dc2be82 | |||
| f6ad8815e3 | |||
| b61a4d30dc | |||
| 60370ebb59 | |||
| 96514d3838 | |||
| 0673a1567f | |||
| 6ae5a6b1bb | |||
| 9d3bd17a03 | |||
| 32f0210aca | |||
| d2840e4157 | |||
| 343d99e8a9 | |||
| f3581ffe00 | |||
| ecfb8eeec0 | |||
| e07bdf2070 | |||
| 77fe45f814 | |||
| 959305c93c | |||
| 203a3f9b71 | |||
| 4a1ac8d31a | |||
| 53299b534b | |||
| 7583c51d07 | |||
| 1d8a0c660c | |||
| 4f93e60f3c | |||
| 35e968a3c5 | |||
| 8bccfa3a5d | |||
| 98eda4af96 | |||
| 9735218118 | |||
| 1bc2a274ae | |||
| a96f532cdf | |||
| cac332c6fb | |||
| 4850d85030 |
```diff
@@ -1,7 +1,6 @@
 keys:
 - &host_tahani age1njjegjjdqzfnrr54f536yl4lduqgna3wuv7ef6vtl9jw5cju0grsgy62tm
 - &host_michael age187jl7e4k9n4guygkmpuqzeh0wenefwrfkpvuyhvwjrjwxqpzassqq3x67j
-- &host_mindy age1dqt3znmzcgghsjjzzax0pf0eyu95h0p7kaf5v988ysjv7fl7lumsatl048
 - &host_jason age1ez6j3r5wdp0tjy7n5qzv5vfakdc2nh2zeu388zu7a80l0thv052syxq5e2
 - &host_chidi age1tlymdmaukhwupzrhszspp26lgd8s64rw4vu9lwc7gsgrjm78095s9fe9l3
 creation_rules:
@@ -10,6 +9,5 @@ creation_rules:
     - age:
         - *host_tahani
         - *host_michael
-        - *host_mindy
         - *host_jason
         - *host_chidi
```
AGENTS.md (new file, +132 lines):

# AGENTS.md

## Build Commands

### Local Development
```bash
nix run .#build                # Build current host config
nix run .#build -- <hostname>  # Build specific host (chidi, jason, michael, tahani)
nix run .#apply                # Build and apply locally (darwin-rebuild/nixos-rebuild switch)
nix flake check                # Validate flake
```

### Remote Deployment (NixOS only)
```bash
colmena build              # Build all NixOS hosts
colmena apply --on <host>  # Deploy to specific NixOS host (michael, tahani)
colmena apply              # Deploy to all NixOS hosts
```

### Formatting
```bash
alejandra .  # Format all Nix files
```

## Code Style

### Formatter
- **Tool**: Alejandra
- **Config**: `alejandra.toml` specifies tabs for indentation
- **Command**: Run `alejandra .` before committing

### File Structure
- **Hosts**: `hosts/<hostname>/` - Per-machine configurations
  - Darwin: `chidi`, `jason`
  - NixOS: `michael`, `tahani`
- **Profiles**: `profiles/` - Reusable program/service configurations (imported by hosts)
- **Modules**: `modules/` - Custom NixOS/darwin modules
- **Lib**: `lib/` - Shared constants and utilities
- **Secrets**: `secrets/` - SOPS-encrypted secrets (`.sops.yaml` for config)

### Nix Language Conventions

**Function Arguments**:
```nix
{inputs, pkgs, lib, ...}:
```
Destructure arguments on separate lines. Use `...` to capture remaining args.

**Imports**:
```nix
../../profiles/foo.nix
```
Use relative paths from file location, not absolute paths.

**Attribute Sets**:
```nix
options.my.gitea = {
  enable = lib.mkEnableOption "Gitea git hosting service";
  bucket = lib.mkOption {
    type = lib.types.str;
    description = "S3 bucket name";
  };
};
```
One attribute per line with trailing semicolons.

**Lists with Packages**:
```nix
with pkgs;
[
  age
  alejandra
  ast-grep
]
```
Use `with pkgs;` for package lists, one item per line.

**Modules**:
```nix
{
  config,
  lib,
  pkgs,
  ...
}:
with lib; let
  cfg = config.my.feature;
in {
  options.my.feature = {
    enable = mkEnableOption "Feature description";
  };
  config = mkIf cfg.enable {
    # configuration
  };
}
```
- Destructure args on separate lines
- Use `with lib;` for brevity with NixOS lib functions
- Define `cfg` for config options
- Use `mkIf`, `mkForce`, `mkDefault` appropriately

**Conditional Platform-Specific Code**:
```nix
++ lib.optionals stdenv.isDarwin [
  _1password-gui
  dockutil
]
++ lib.optionals stdenv.isLinux [
  lm_sensors
]
```

### Naming Conventions
- **Option names**: `my.<feature>.<option>` for custom modules
- **Hostnames**: Lowercase, descriptive (e.g., `michael`, `tahani`)
- **Profile files**: Descriptive, lowercase with hyphens (e.g., `homebrew.nix`)

### Secrets Management
- Use SOPS for secrets (see `.sops.yaml`)
- Never commit unencrypted secrets
- Secrets files in `hosts/<host>/secrets.nix` import SOPS-generated files

### Imports Pattern
Host configs import:
1. System modules (`modulesPath + "/..."`)
2. Host-specific files (`./disk-config.nix`, `./hardware-configuration.nix`)
3. SOPS secrets (`./secrets.nix`)
4. Custom modules (`../../modules/*.nix`)
5. Base profiles (`../../profiles/*.nix`)
6. Input modules (`inputs.<module>.xxxModules.module`)

Home-manager users import profiles in a similar manner.
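The import order above can be sketched as a hypothetical host entry point. The file path and the module/profile names here are illustrative assumptions, not taken from the repository:

```nix
# Hypothetical hosts/<host>/default.nix following the import order above.
# example-module.nix and example-profile.nix are made-up names for illustration.
{inputs, modulesPath, ...}: {
  imports = [
    (modulesPath + "/profiles/qemu-guest.nix") # 1. system modules
    ./disk-config.nix                          # 2. host-specific files
    ./hardware-configuration.nix
    ./secrets.nix                              # 3. SOPS secrets
    ../../modules/example-module.nix           # 4. custom modules
    ../../profiles/example-profile.nix         # 5. base profiles
    inputs.sops-nix.nixosModules.sops          # 6. input modules
  ];
}
```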
flake.lock (generated, 358 lines changed)

New lock entries: colmena (zhaofengli/colmena @ 349b035a5027), fenix (nix-community/fenix @ 55106e04d905), flake-compat (edolstra/flake-compat @ b4a34015c698), flake-utils (@ c0e246b9b83f), flake-utils_2 (@ 11707dc2f618), lumen (jnsahaj/lumen @ af5fa88eba12), nix-github-actions (nix-community @ e04df33f62cd), nono (lukehinds/nono @ e80983bb6a40), openusage (robinebers/openusage, ref v0.5.1, @ 22a7bd5f7856), overseer (dmmulroy/overseer @ 5880d9793974), rust-analyzer-src (rust-lang/rust-analyzer, ref nightly, @ e42e8ff582ba), stable (NixOS/nixpkgs, ref nixos-25.05, @ 36ab78dab7da), systems_4 (nix-systems/default @ da67096a3b9b).

Updated entries:

| Input | Old rev | New rev |
|---|---|---|
| blueprint (numtide/blueprint) | 5a9bba070f80 | c7da5c70ad1c |
| brew-src (Homebrew/brew, ref 5.0.3 → 5.0.12) | fbfdbaba0081 | d01011cac6d7 |
| darwin (LnL7/nix-darwin) | 5fb45ece6129 | 0d7874ef7e3b |
| disko (nix-community/disko) | 916506443ecd | 71a3fc97d808 |
| flake-parts and flake-parts_2 (hercules-ci) | a34fae9c08a1 | 57928607ea56 |
| home-manager (nix-community) | bb35f07cc95a | b1f916ba0523 |
| homebrew-cask | fc4c83428bbc | c3bb7aedf088 |
| homebrew-core | dabb1ecfb480 | a12e59e6d202 |
| llm-agents (numtide/llm-agents.nix) | 51ade74457af | 09019dadd541 |
| nix-homebrew (zhaofengli-wip) | 6a8ab60bfd66 | a5409abd0d50 |
| nixpkgs | 7d853e518814 | fef9403a3e4d |
| nixpkgs_2 (root, ref nixos-unstable → master) | c6245e83d836 | fe776c9fe2c3 |
| nixpkgs_3 | def3da69945b | ae67888ff7ef |
| nixvim (nix-community, systems: systems_2 → systems_3) | 6013b67dc9ae | d354487c4692 |
| sops-nix (Mic92) | 443a7f2e7e11 | d6e0e666048a |
| treefmt-nix (numtide) | 42d96e75aa56 | 337a4fe074be |

The root node gains inputs colmena, lumen, nono, openusage, and overseer; zjstatus now resolves flake-utils to flake-utils_3.
flake.nix (112 changed lines)
@@ -2,7 +2,7 @@
   description = "Configuration for my macOS laptops and NixOS server";

   inputs = {
-    nixpkgs.url = "github:nixos/nixpkgs/nixos-unstable";
+    nixpkgs.url = "github:nixos/nixpkgs/master";
     flake-parts.url = "github:hercules-ci/flake-parts";
     sops-nix = {
       url = "github:Mic92/sops-nix";
@@ -32,16 +32,42 @@
       url = "github:nix-community/disko";
       inputs.nixpkgs.follows = "nixpkgs";
     };
+    colmena = {
+      url = "github:zhaofengli/colmena";
+      inputs.nixpkgs.follows = "nixpkgs";
+    };
+    lumen = {
+      url = "github:jnsahaj/lumen";
+      inputs.nixpkgs.follows = "nixpkgs";
+    };
+    nono = {
+      url = "github:lukehinds/nono";
+      flake = false;
+    };
+    overseer = {
+      url = "github:dmmulroy/overseer";
+      flake = false;
+    };
+    openusage = {
+      url = "github:robinebers/openusage/v0.5.1";
+      flake = false;
+    };
   };

   outputs = inputs @ {flake-parts, ...}:
     flake-parts.lib.mkFlake {inherit inputs;} (
       let
+        inherit (inputs.nixpkgs) lib;
         constants = import ./lib/constants.nix;
-        user = constants.user;
+        inherit (constants) user;

         darwinHosts = ["chidi" "jason"];
-        nixosHosts = ["michael" "mindy" "tahani"];
+        nixosHosts = ["michael" "tahani"];

         overlays = import ./overlays {inherit inputs;};
+        nixpkgsConfig = hostPlatform: {
+          nixpkgs = {inherit hostPlatform overlays;};
+        };
       in {
         systems = [
           "x86_64-linux"
@@ -49,27 +75,23 @@
         ];

         flake.darwinConfigurations =
-          inputs.nixpkgs.lib.genAttrs darwinHosts (
+          lib.genAttrs darwinHosts (
             hostname:
               inputs.darwin.lib.darwinSystem {
-                system = "aarch64-darwin";
-                specialArgs = {
-                  inherit inputs user hostname constants;
-                };
+                specialArgs = {inherit inputs user hostname constants;};
                 modules = [
                   inputs.home-manager.darwinModules.home-manager
                   inputs.nix-homebrew.darwinModules.nix-homebrew
+                  (nixpkgsConfig "aarch64-darwin")
                   {
-                    nixpkgs.overlays = overlays;
-
                     nix-homebrew = {
                       inherit user;
                       enable = true;
+                      mutableTaps = true;
                       taps = {
                         "homebrew/homebrew-core" = inputs.homebrew-core;
                         "homebrew/homebrew-cask" = inputs.homebrew-cask;
                       };
-                      mutableTaps = true;
                     };
                   }
                   ./hosts/${hostname}
@@ -78,27 +100,54 @@
           );

         flake.nixosConfigurations =
-          inputs.nixpkgs.lib.genAttrs nixosHosts (
+          lib.genAttrs nixosHosts (
             hostname:
-              inputs.nixpkgs.lib.nixosSystem {
-                system = "x86_64-linux";
-                specialArgs = {
-                  inherit inputs user hostname constants;
-                };
+              lib.nixosSystem {
+                specialArgs = {inherit inputs user hostname constants;};
                 modules = [
                   inputs.home-manager.nixosModules.home-manager
-                  {
-                    nixpkgs.overlays = overlays;
-                  }
+                  (nixpkgsConfig "x86_64-linux")
                   ./hosts/${hostname}
                 ];
               }
           );

+        flake.colmena =
+          {
+            meta = {
+              nixpkgs = inputs.nixpkgs.legacyPackages.x86_64-linux;
+              specialArgs = {inherit inputs user constants;};
+            };
+          }
+          // lib.genAttrs nixosHosts (
+            hostname: {
+              deployment = {
+                targetHost = hostname;
+                targetUser = user;
+              };
+              imports = [
+                inputs.home-manager.nixosModules.home-manager
+                (nixpkgsConfig "x86_64-linux")
+                {_module.args.hostname = hostname;}
+                ./hosts/${hostname}
+              ];
+            }
+          );
+
+        flake.nixosModules = {
+          pgbackrest = ./modules/pgbackrest.nix;
+        };
+
+        flake.overlays = {
+          default = lib.composeManyExtensions overlays;
+          list = overlays;
+        };
+
+        flake.lib = {inherit constants;};
+
         perSystem = {
           pkgs,
           system,
-          inputs',
           ...
         }: let
           mkApp = name: {
@@ -117,27 +166,8 @@
             "rollback"
           ];
         in {
-          devShells.default =
-            pkgs.mkShell {
-              nativeBuildInputs = with pkgs; [
-                bashInteractive
-                git
-                age
-                age-plugin-yubikey
-              ];
-              shellHook = ''export EDITOR=nvim'';
-            };
-
-          apps =
-            builtins.listToAttrs (
-              map (n: {
-                name = n;
-                value = mkApp n;
-              })
-              appNames
-            );
+          apps = pkgs.lib.genAttrs appNames mkApp;
         };
-        flake.overlays = overlays;
       }
     );
 }
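The flake.nix refactor above replaces the per-host `system` attribute and inline `nixpkgs.overlays` module with a small `nixpkgsConfig` helper shared by the darwin, NixOS, and colmena configurations. A minimal standalone sketch of the pattern (the `overlays` list is stubbed here; in the flake it comes from `import ./overlays {inherit inputs;}`):

```nix
# Sketch only: shows what the helper from the diff evaluates to.
let
  overlays = [];
  nixpkgsConfig = hostPlatform: {
    nixpkgs = {inherit hostPlatform overlays;};
  };
in
  # (nixpkgsConfig "aarch64-darwin") yields the module fragment
  # { nixpkgs = { hostPlatform = "aarch64-darwin"; overlays = []; }; }
  nixpkgsConfig "aarch64-darwin"
```

Since `nixpkgs.hostPlatform` subsumes the old top-level `system` argument, each call site only names the platform once.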
@@ -1,74 +1,49 @@
 {
-  inputs,
   pkgs,
+  inputs,
   user,
-  constants,
+  hostname,
   ...
 }: {
   imports = [
-    ../../modules/syncthing.nix
+    ./secrets.nix
     ../../profiles/core.nix
     ../../profiles/darwin.nix
     ../../profiles/dock.nix
     ../../profiles/homebrew.nix
-    ../../profiles/syncthing.nix
     ../../profiles/tailscale.nix
     inputs.sops-nix.darwinModules.sops
   ];

-  networking.hostName = "chidi";
-  networking.computerName = "Chidi";
+  networking.hostName = hostname;
+  networking.computerName = hostname;

-  sops.age.keyFile = "/Users/${user}/.config/sops/age/keys.txt";
-
-  sops.secrets = {
-    chidi-syncthing-cert = {
-      sopsFile = ../../secrets/chidi-syncthing-cert;
-      format = "binary";
-      owner = user;
-      path = "/Users/${user}/.config/syncthing/cert.pem";
-    };
-    chidi-syncthing-key = {
-      sopsFile = ../../secrets/chidi-syncthing-key;
-      format = "binary";
-      owner = user;
-      path = "/Users/${user}/.config/syncthing/key.pem";
-    };
-  };
-
-  services.syncthing.settings.folders = {
-    "Projects/Work" = {
-      path = "/Users/${user}/Projects/Work";
-      devices = ["tahani" "chidi"];
-    };
-  };
-
   home-manager.users.${user} = {
     imports = [
-      inputs.nixvim.homeModules.nixvim
       ../../profiles/atuin.nix
+      ../../profiles/aerospace.nix
       ../../profiles/bash.nix
       ../../profiles/bat.nix
       ../../profiles/direnv.nix
-      ../../profiles/eza.nix
-      ../../profiles/fish.nix
+      ../../profiles/nushell.nix
       ../../profiles/fzf.nix
       ../../profiles/ghostty.nix
       ../../profiles/git.nix
       ../../profiles/home.nix
-      ../../profiles/jjui.nix
-      ../../profiles/jujutsu.nix
       ../../profiles/lazygit.nix
+      ../../profiles/lumen.nix
       ../../profiles/mise.nix
+      ../../profiles/nono.nix
       ../../profiles/neovim
       ../../profiles/opencode.nix
+      ../../profiles/claude-code.nix
       ../../profiles/ripgrep.nix
       ../../profiles/ssh.nix
       ../../profiles/starship.nix
-      ../../profiles/zellij.nix
       ../../profiles/zk.nix
       ../../profiles/zoxide.nix
       ../../profiles/zsh.nix
+      inputs.nixvim.homeModules.nixvim
     ];
     fonts.fontconfig.enable = true;
     programs.git.settings.user.email = "christoph@tuist.dev";
hosts/chidi/secrets.nix (new file, +5)
@@ -0,0 +1,5 @@
+{user, ...}: {
+  sops.age.keyFile = "/Users/${user}/.config/sops/age/keys.txt";
+  sops.age.sshKeyPaths = [];
+  sops.gnupg.sshKeyPaths = [];
+}
@@ -1,74 +1,48 @@
 {
   inputs,
   user,
+  hostname,
   ...
 }: {
   imports = [
-    ../../modules/syncthing.nix
+    ./secrets.nix
     ../../profiles/core.nix
     ../../profiles/darwin.nix
     ../../profiles/dock.nix
     ../../profiles/homebrew.nix
-    ../../profiles/syncthing.nix
     ../../profiles/tailscale.nix
     inputs.sops-nix.darwinModules.sops
   ];

-  networking.hostName = "jason";
-  networking.computerName = "Jason";
+  networking.hostName = hostname;
+  networking.computerName = hostname;

-  services.syncthing.settings.folders = {
-    "Projects/Personal" = {
-      path = "/Users/${user}/Projects/Personal";
-      devices = ["tahani" "jason"];
-    };
-  };
-
-  sops.age.keyFile = "/Users/${user}/.config/sops/age/keys.txt";
-  sops.age.sshKeyPaths = [];
-  sops.gnupg.sshKeyPaths = [];
-
-  sops.secrets = {
-    jason-syncthing-cert = {
-      sopsFile = ../../secrets/jason-syncthing-cert;
-      format = "binary";
-      owner = user;
-      path = "/Users/${user}/.config/syncthing/cert.pem";
-    };
-    jason-syncthing-key = {
-      sopsFile = ../../secrets/jason-syncthing-key;
-      format = "binary";
-      owner = user;
-      path = "/Users/${user}/.config/syncthing/key.pem";
-    };
-  };
-
   home-manager.users.${user} = {
     imports = [
-      inputs.nixvim.homeModules.nixvim
       ../../profiles/atuin.nix
+      ../../profiles/aerospace.nix
       ../../profiles/bash.nix
       ../../profiles/bat.nix
       ../../profiles/direnv.nix
-      ../../profiles/eza.nix
-      ../../profiles/fish.nix
+      ../../profiles/nushell.nix
       ../../profiles/fzf.nix
       ../../profiles/ghostty.nix
       ../../profiles/git.nix
       ../../profiles/home.nix
-      ../../profiles/jjui.nix
-      ../../profiles/jujutsu.nix
       ../../profiles/lazygit.nix
+      ../../profiles/lumen.nix
       ../../profiles/mise.nix
+      ../../profiles/nono.nix
       ../../profiles/neovim
       ../../profiles/opencode.nix
+      ../../profiles/claude-code.nix
       ../../profiles/ripgrep.nix
       ../../profiles/ssh.nix
       ../../profiles/starship.nix
-      ../../profiles/zellij.nix
       ../../profiles/zk.nix
       ../../profiles/zoxide.nix
       ../../profiles/zsh.nix
+      inputs.nixvim.homeModules.nixvim
     ];
     fonts.fontconfig.enable = true;
     programs.git.settings.user.email = "christoph@schmatzler.com";
hosts/jason/secrets.nix (new file, +5)
@@ -0,0 +1,5 @@
+{user, ...}: {
+  sops.age.keyFile = "/Users/${user}/.config/sops/age/keys.txt";
+  sops.age.sshKeyPaths = [];
+  sops.gnupg.sshKeyPaths = [];
+}
@@ -1,9 +1,9 @@
 {
-  modulesPath,
-  hostname,
+  config,
   inputs,
   user,
-  constants,
+  hostname,
+  modulesPath,
   ...
 }: {
   imports = [
@@ -11,54 +11,38 @@
     (modulesPath + "/profiles/qemu-guest.nix")
     ./disk-config.nix
     ./hardware-configuration.nix
+    ./secrets.nix
+    ../../modules/gitea.nix
     ../../profiles/core.nix
     ../../profiles/fail2ban.nix
-    ../../profiles/gitea.nix
     ../../profiles/nixos.nix
+    ../../profiles/openssh.nix
     ../../profiles/tailscale.nix
     inputs.disko.nixosModules.disko
     inputs.sops-nix.nixosModules.sops
   ];

-  sops.secrets.litestream = {
-    sopsFile = ../../secrets/michael-litestream;
-    format = "binary";
-  };
-
-  home-manager.users.${user} = {
-    pkgs,
-    lib,
-    ...
-  }: {
-    _module.args = {inherit user constants inputs;};
-    imports = [
-      inputs.nixvim.homeModules.nixvim
-      ../../profiles/bash.nix
-      ../../profiles/bat.nix
-      ../../profiles/direnv.nix
-      ../../profiles/eza.nix
-      ../../profiles/fish.nix
-      ../../profiles/fzf.nix
-      ../../profiles/git.nix
-      ../../profiles/home.nix
-      ../../profiles/jjui.nix
-      ../../profiles/jujutsu.nix
-      ../../profiles/lazygit.nix
-      ../../profiles/neovim
-      ../../profiles/ripgrep.nix
-      ../../profiles/ssh.nix
-      ../../profiles/starship.nix
-      ../../profiles/zoxide.nix
-    ];
-  };
-
-  services.openssh = {
+  my.gitea = {
     enable = true;
-    settings = {
-      PermitRootLogin = "yes";
-      PasswordAuthentication = false;
+    litestream = {
+      bucket = "michael-gitea-litestream";
+      secretFile = config.sops.secrets.michael-gitea-litestream.path;
+    };
+    restic = {
+      bucket = "michael-gitea-repositories";
+      passwordFile = config.sops.secrets.michael-gitea-restic-password.path;
+      environmentFile = config.sops.secrets.michael-gitea-restic-env.path;
     };
   };

   networking.hostName = hostname;
+
+  home-manager.users.${user} = {
+    imports = [
+      ../../profiles/nushell.nix
+      ../../profiles/home.nix
+      ../../profiles/ssh.nix
+      inputs.nixvim.homeModules.nixvim
+    ];
+  };
 }
hosts/michael/secrets.nix (new file, +22)
@@ -0,0 +1,22 @@
+{...}: {
+  sops.secrets = {
+    michael-gitea-litestream = {
+      sopsFile = ../../secrets/michael-gitea-litestream;
+      format = "binary";
+      owner = "gitea";
+      group = "gitea";
+    };
+    michael-gitea-restic-password = {
+      sopsFile = ../../secrets/michael-gitea-restic-password;
+      format = "binary";
+      owner = "gitea";
+      group = "gitea";
+    };
+    michael-gitea-restic-env = {
+      sopsFile = ../../secrets/michael-gitea-restic-env;
+      format = "binary";
+      owner = "gitea";
+      group = "gitea";
+    };
+  };
+}
@@ -1,75 +0,0 @@
-{
-  modulesPath,
-  hostname,
-  inputs,
-  user,
-  constants,
-  ...
-}: {
-  imports = [
-    (modulesPath + "/installer/scan/not-detected.nix")
-    (modulesPath + "/profiles/qemu-guest.nix")
-    ./disk-config.nix
-    ./hardware-configuration.nix
-    ../../modules/pgbackrest.nix
-    ../../profiles/core.nix
-    ../../profiles/fail2ban.nix
-    ../../profiles/nixos.nix
-    ../../profiles/postgresql.nix
-    ../../profiles/tailscale.nix
-    inputs.disko.nixosModules.disko
-    inputs.sops-nix.nixosModules.sops
-  ];
-
-  sops.secrets.mindy-pgbackrest = {
-    sopsFile = ../../secrets/mindy-pgbackrest;
-    format = "binary";
-    owner = "postgres";
-    group = "postgres";
-  };
-
-  my.pgbackrest = {
-    enable = true;
-    secretFile = "/run/secrets/mindy-pgbackrest";
-    s3.bucket = "mindy-pgbackrest";
-  };
-
-  home-manager.users.${user} = {
-    pkgs,
-    lib,
-    ...
-  }: {
-    _module.args = {inherit user constants inputs;};
-    imports = [
-      inputs.nixvim.homeModules.nixvim
-      ../../profiles/bash.nix
-      ../../profiles/bat.nix
-      ../../profiles/direnv.nix
-      ../../profiles/eza.nix
-      ../../profiles/fish.nix
-      ../../profiles/fzf.nix
-      ../../profiles/git.nix
-      ../../profiles/home.nix
-      ../../profiles/jjui.nix
-      ../../profiles/jujutsu.nix
-      ../../profiles/lazygit.nix
-      ../../profiles/neovim
-      ../../profiles/ripgrep.nix
-      ../../profiles/ssh.nix
-      ../../profiles/starship.nix
-      ../../profiles/zoxide.nix
-    ];
-  };
-
-  services.openssh = {
-    enable = true;
-    settings = {
-      PermitRootLogin = "yes";
-      PasswordAuthentication = false;
-    };
-  };
-
-  virtualisation.docker.enable = true;
-
-  networking.hostName = hostname;
-}
@@ -1,37 +0,0 @@
-{
-  disko.devices = {
-    disk = {
-      main = {
-        type = "disk";
-        device = "/dev/sda";
-        content = {
-          type = "gpt";
-          partitions = {
-            boot = {
-              size = "1M";
-              type = "EF02";
-            };
-            ESP = {
-              size = "512M";
-              type = "EF00";
-              content = {
-                type = "filesystem";
-                format = "vfat";
-                mountpoint = "/boot";
-                mountOptions = ["umask=0077"];
-              };
-            };
-            root = {
-              size = "100%";
-              content = {
-                type = "filesystem";
-                format = "ext4";
-                mountpoint = "/";
-              };
-            };
-          };
-        };
-      };
-    };
-  };
-}
@@ -1,18 +0,0 @@
-{
-  lib,
-  modulesPath,
-  ...
-}: {
-  imports = [
-    (modulesPath + "/profiles/qemu-guest.nix")
-  ];
-
-  boot.initrd.availableKernelModules = ["ahci" "xhci_pci" "virtio_pci" "virtio_scsi" "sd_mod" "sr_mod"];
-  boot.initrd.kernelModules = [];
-  boot.kernelModules = [];
-  boot.extraModulePackages = [];
-
-  nixpkgs.hostPlatform = lib.mkDefault "x86_64-linux";
-
-  networking.useDHCP = lib.mkDefault true;
-}
hosts/tahani/adguardhome.nix (new file, +57)
@@ -0,0 +1,57 @@
+{
+  services.adguardhome = {
+    enable = true;
+    host = "0.0.0.0";
+    port = 10000;
+    settings = {
+      dns = {
+        upstream_dns = [
+          "1.1.1.1"
+          "1.0.0.1"
+        ];
+      };
+      filtering = {
+        protection_enabled = true;
+        filtering_enabled = true;
+        safe_search = {
+          enabled = false;
+        };
+        safebrowsing_enabled = true;
+        blocked_response_ttl = 10;
+        filters_update_interval = 24;
+        blocked_services = {
+          ids = [
+            "reddit"
+            "twitter"
+          ];
+        };
+      };
+      filters = [
+        {
+          enabled = true;
+          url = "https://cdn.jsdelivr.net/gh/hagezi/dns-blocklists@latest/adblock/pro.txt";
+          name = "HaGeZi Multi PRO";
+          id = 1;
+        }
+        {
+          enabled = true;
+          url = "https://cdn.jsdelivr.net/gh/hagezi/dns-blocklists@latest/adblock/tif.txt";
+          name = "HaGeZi Threat Intelligence Feeds";
+          id = 2;
+        }
+        {
+          enabled = true;
+          url = "https://cdn.jsdelivr.net/gh/hagezi/dns-blocklists@latest/adblock/gambling.txt";
+          name = "HaGeZi Gambling";
+          id = 3;
+        }
+        {
+          enabled = true;
+          url = "https://cdn.jsdelivr.net/gh/hagezi/dns-blocklists@latest/adblock/nsfw.txt";
+          name = "HaGeZi NSFW";
+          id = 4;
+        }
+      ];
+    };
+  };
+}
hosts/tahani/cache.nix (new file, +10)
@@ -0,0 +1,10 @@
+{...}: {
+  services.caddy.virtualHosts."cache.manticore-hippocampus.ts.net" = {
+    extraConfig = ''
+      tls {
+        get_certificate tailscale
+      }
+      reverse_proxy localhost:32843
+    '';
+  };
+}
@@ -1,144 +1,63 @@
 {
-  config,
-  hostname,
-  user,
+  pkgs,
   inputs,
-  constants,
+  user,
+  hostname,
   ...
 }: {
   imports = [
+    ./adguardhome.nix
+    ./cache.nix
+    ./networking.nix
+    ./paperless.nix
+    ./secrets.nix
     ../../profiles/core.nix
     ../../profiles/nixos.nix
-    ../../profiles/syncthing.nix
+    ../../profiles/openssh.nix
     ../../profiles/tailscale.nix
     inputs.sops-nix.nixosModules.sops
   ];

+  networking.hostName = hostname;
+
   home-manager.users.${user} = {
-    pkgs,
-    lib,
-    ...
-  }: {
-    _module.args = {inherit user constants inputs;};
     imports = [
-      inputs.nixvim.homeModules.nixvim
       ../../profiles/atuin.nix
       ../../profiles/bash.nix
       ../../profiles/bat.nix
       ../../profiles/direnv.nix
-      ../../profiles/eza.nix
-      ../../profiles/fish.nix
+      ../../profiles/nushell.nix
       ../../profiles/fzf.nix
       ../../profiles/git.nix
       ../../profiles/home.nix
-      ../../profiles/jjui.nix
-      ../../profiles/jujutsu.nix
       ../../profiles/lazygit.nix
+      ../../profiles/lumen.nix
       ../../profiles/mise.nix
+      ../../profiles/nono.nix
       ../../profiles/neovim
       ../../profiles/opencode.nix
+      ../../profiles/overseer.nix
+      ../../profiles/claude-code.nix
       ../../profiles/ripgrep.nix
       ../../profiles/ssh.nix
       ../../profiles/starship.nix
-      ../../profiles/zellij.nix
       ../../profiles/zk.nix
       ../../profiles/zoxide.nix
       ../../profiles/zsh.nix
-    ];
-
-    home.packages = [
-      inputs.llm-agents.packages.${pkgs.stdenv.hostPlatform.system}.amp
-      inputs.llm-agents.packages.${pkgs.stdenv.hostPlatform.system}.beads
+      inputs.nixvim.homeModules.nixvim
     ];

     programs.git.settings.user.email = "christoph@schmatzler.com";
   };

-  services.adguardhome = {
-    enable = true;
-    port = 10000;
-    settings = {
-      dns = {
-        upstream_dns = [
-          "1.1.1.1"
-          "1.0.0.1"
-        ];
-      };
-      filtering = {
-        protection_enabled = true;
-        filtering_enabled = true;
-        safe_search = {
-          enabled = false;
-        };
-      };
-    };
-  };
-
-  virtualisation.docker = {
-    enable = true;
-  };
-
-  services.openssh = {
-    enable = true;
-    settings = {
-      PermitRootLogin = "prohibit-password";
-      PasswordAuthentication = false;
-    };
-  };
-
-  fileSystems."/" = {
-    device = "/dev/disk/by-label/NIXROOT";
-    fsType = "ext4";
-  };
-
-  fileSystems."/boot" = {
-    device = "/dev/disk/by-label/NIXBOOT";
-    fsType = "vfat";
-  };
-
-  networking = {
-    hostName = hostname;
-    useDHCP = false;
-    interfaces.eno1.ipv4.addresses = [
-      {
-        address = "192.168.1.10";
-        prefixLength = 24;
-      }
-    ];
-    defaultGateway = "192.168.1.1";
-    nameservers = ["1.1.1.1"];
-    firewall = {
-      enable = true;
-      trustedInterfaces = ["eno1" "tailscale0"];
-      allowedUDPPorts = [config.services.tailscale.port];
-      allowedTCPPorts = [22 5555];
-      checkReversePath = "loose";
-    };
-  };
-
-  sops.secrets = {
-    tahani-syncthing-cert = {
-      sopsFile = ../../secrets/tahani-syncthing-cert;
-      format = "binary";
-      owner = user;
-      path = "/home/${user}/.config/syncthing/cert.pem";
-    };
-    tahani-syncthing-key = {
-      sopsFile = ../../secrets/tahani-syncthing-key;
-      format = "binary";
-      owner = user;
-      path = "/home/${user}/.config/syncthing/key.pem";
-    };
-  };
-
-  services.syncthing.settings.folders = {
-    "Projects/Personal" = {
-      path = "/home/${user}/Projects/Personal";
-      devices = ["tahani" "jason"];
-    };
-    "Projects/Work" = {
-      path = "/home/${user}/Projects/Work";
-      devices = ["tahani" "chidi"];
-    };
-  };
+  virtualisation.docker.enable = true;
+
+  users.users.${user}.extraGroups = ["docker"];
+
+  swapDevices = [
+    {
+      device = "/swapfile";
+      size = 16 * 1024;
+    }
+  ];
 }
hosts/tahani/networking.nix (new file, +38)
@@ -0,0 +1,38 @@
+{config, ...}: {
+  services.tailscale.extraSetFlags = ["--accept-routes=false"];
+
+  networking = {
+    useDHCP = false;
+    interfaces.eno1.ipv4.addresses = [
+      {
+        address = "192.168.1.10";
+        prefixLength = 24;
+      }
+    ];
+    defaultGateway = "192.168.1.1";
+    nameservers = ["1.1.1.1"];
+    firewall = {
+      enable = true;
+      trustedInterfaces = ["eno1" "tailscale0"];
+      allowedUDPPorts = [
+        53
+        config.services.tailscale.port
+      ];
+      allowedTCPPorts = [
+        22
+        53
+      ];
+      checkReversePath = "loose";
+    };
+  };
+
+  fileSystems."/" = {
+    device = "/dev/disk/by-label/NIXROOT";
+    fsType = "ext4";
+  };
+
+  fileSystems."/boot" = {
+    device = "/dev/disk/by-label/NIXBOOT";
+    fsType = "vfat";
+  };
+}
hosts/tahani/paperless.nix (new file, +73)
@@ -0,0 +1,73 @@
+{config, ...}: {
+  services.caddy = {
+    enable = true;
+    globalConfig = ''
+      admin off
+    '';
+    virtualHosts."docs.manticore-hippocampus.ts.net" = {
+      extraConfig = ''
+        tls {
+          get_certificate tailscale
+        }
+        reverse_proxy localhost:${toString config.services.paperless.port}
+      '';
+    };
+    virtualHosts."docs-ai.manticore-hippocampus.ts.net" = {
+      extraConfig = ''
+        tls {
+          get_certificate tailscale
+        }
+        reverse_proxy localhost:3000
+      '';
+    };
+  };
+
+  virtualisation.oci-containers = {
+    backend = "docker";
+    containers.paperless-ai = {
+      image = "clusterzx/paperless-ai:latest";
+      autoStart = true;
+      volumes = [
+        "paperless-ai-data:/app/data"
+      ];
+      environment = {
+        PUID = "1000";
+        PGID = "1000";
+        PAPERLESS_AI_PORT = "3000";
+        # Initial setup wizard will configure the rest
+        PAPERLESS_AI_INITIAL_SETUP = "yes";
+        # Paperless-ngx API URL accessible from container (using host network)
+        PAPERLESS_API_URL = "http://127.0.0.1:${toString config.services.paperless.port}/api";
+      };
+      extraOptions = [
+        "--network=host"
+      ];
+    };
+  };
+
+  services.redis.servers.paperless = {
+    enable = true;
+    port = 6379;
+    bind = "127.0.0.1";
+    settings = {
+      maxmemory = "256mb";
+      maxmemory-policy = "allkeys-lru";
+    };
+  };
+
+  services.paperless = {
+    enable = true;
+    address = "0.0.0.0";
+    passwordFile = config.sops.secrets.tahani-paperless-password.path;
+    settings = {
+      PAPERLESS_DBENGINE = "sqlite";
+      PAPERLESS_REDIS = "redis://127.0.0.1:6379";
+      PAPERLESS_CONSUMER_IGNORE_PATTERN = [
+        ".DS_STORE/*"
+        "desktop.ini"
+      ];
+      PAPERLESS_OCR_LANGUAGE = "deu+eng";
+      PAPERLESS_CSRF_TRUSTED_ORIGINS = "https://docs.manticore-hippocampus.ts.net";
+    };
+  };
+}
8  hosts/tahani/secrets.nix  Normal file
@@ -0,0 +1,8 @@
{...}: {
  sops.secrets = {
    tahani-paperless-password = {
      sopsFile = ../../secrets/tahani-paperless-password;
      format = "binary";
    };
  };
}
20  lib/build-rust-package.nix  Normal file
@@ -0,0 +1,20 @@
{
  input,
  prev,
}: let
  manifest = (prev.lib.importTOML "${input}/Cargo.toml").package;
in
  prev.rustPlatform.buildRustPackage {
    pname = manifest.name;
    version = manifest.version;

    cargoLock.lockFile = "${input}/Cargo.lock";

    src = input;

    nativeBuildInputs = [prev.pkg-config];
    buildInputs = [prev.openssl];
    OPENSSL_NO_VENDOR = 1;

    doCheck = false;
  }
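A minimal sketch of how this helper might be consumed from an overlay; the flake input name `my-tool-src` and the attribute name `my-tool` are assumptions for illustration, not part of the diff:

```nix
# Hypothetical overlay: builds a Rust flake input with the helper above.
inputs: final: prev: {
  my-tool = import ./lib/build-rust-package.nix {
    input = inputs.my-tool-src; # assumed flake input holding the Rust source
    inherit prev;
  };
}
```

Since `pname` and `version` are read from the input's Cargo.toml via `importTOML`, the overlay never has to hardcode them.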
198  modules/gitea.nix  Normal file
@@ -0,0 +1,198 @@
{
  config,
  lib,
  pkgs,
  ...
}:
with lib; let
  cfg = config.my.gitea;
in {
  options.my.gitea = {
    enable = mkEnableOption "Gitea git hosting service";

    litestream = {
      bucket =
        mkOption {
          type = types.str;
          description = "S3 bucket name for Litestream database replication";
        };

      secretFile =
        mkOption {
          type = types.path;
          description = "Path to the environment file containing S3 credentials for Litestream";
        };
    };

    restic = {
      bucket =
        mkOption {
          type = types.str;
          description = "S3 bucket name for Restic repository backups";
        };

      passwordFile =
        mkOption {
          type = types.path;
          description = "Path to the file containing the Restic repository password";
        };

      environmentFile =
        mkOption {
          type = types.path;
          description = "Path to the environment file containing S3 credentials for Restic";
        };
    };

    s3 = {
      endpoint =
        mkOption {
          type = types.str;
          default = "s3.eu-central-003.backblazeb2.com";
          description = "S3 endpoint URL";
        };
    };
  };

  config =
    mkIf cfg.enable {
      networking.firewall.allowedTCPPorts = [80 443];

      services.redis.servers.gitea = {
        enable = true;
        port = 6380;
        bind = "127.0.0.1";
        settings = {
          maxmemory = "64mb";
          maxmemory-policy = "allkeys-lru";
        };
      };

      services.gitea = {
        enable = true;
        database = {
          type = "sqlite3";
          path = "/var/lib/gitea/data/gitea.db";
        };
        settings = {
          server = {
            ROOT_URL = "https://git.schmatzler.com/";
            DOMAIN = "git.schmatzler.com";
            HTTP_ADDR = "127.0.0.1";
            HTTP_PORT = 3000;
            LANDING_PAGE = "explore";
          };
          service.DISABLE_REGISTRATION = true;
          security.INSTALL_LOCK = true;
          cache = {
            ADAPTER = "redis";
            HOST = "redis://127.0.0.1:6380/0?pool_size=100&idle_timeout=180s";
            ITEM_TTL = "16h";
          };
          "cache.last_commit" = {
            ITEM_TTL = "8760h";
            COMMITS_COUNT = 100;
          };
          session = {
            PROVIDER = "redis";
            PROVIDER_CONFIG = "redis://127.0.0.1:6380/1?pool_size=100&idle_timeout=180s";
            COOKIE_SECURE = true;
            SAME_SITE = "strict";
          };
          api.ENABLE_SWAGGER = false;
        };
      };

      services.litestream = {
        enable = true;
        environmentFile = cfg.litestream.secretFile;
        settings = {
          dbs = [
            {
              path = "/var/lib/gitea/data/gitea.db";
              replicas = [
                {
                  type = "s3";
                  bucket = cfg.litestream.bucket;
                  path = "gitea";
                  endpoint = cfg.s3.endpoint;
                }
              ];
            }
          ];
        };
      };

      systemd.services.litestream = {
        serviceConfig = {
          User = mkForce "gitea";
          Group = mkForce "gitea";
        };
      };

      services.caddy = {
        enable = true;
        virtualHosts."git.schmatzler.com".extraConfig = ''
          header {
            Strict-Transport-Security "max-age=31536000; includeSubDomains"
            X-Content-Type-Options "nosniff"
            X-Frame-Options "DENY"
            Referrer-Policy "strict-origin-when-cross-origin"
          }
          reverse_proxy localhost:3000
        '';
      };

      services.restic.backups.gitea = {
        repository = "s3:${cfg.s3.endpoint}/${cfg.restic.bucket}";
        paths = ["/var/lib/gitea"];
        exclude = [
          "/var/lib/gitea/log"
          "/var/lib/gitea/data/gitea.db"
          "/var/lib/gitea/data/gitea.db-shm"
          "/var/lib/gitea/data/gitea.db-wal"
        ];
        passwordFile = cfg.restic.passwordFile;
        environmentFile = cfg.restic.environmentFile;
        pruneOpts = [
          "--keep-daily 7"
          "--keep-weekly 4"
          "--keep-monthly 6"
        ];
        timerConfig = {
          OnCalendar = "daily";
          Persistent = true;
          RandomizedDelaySec = "1h";
        };
      };

      systemd.services.restic-backups-gitea = {
        wants = ["restic-init-gitea.service"];
        after = ["restic-init-gitea.service"];
        serviceConfig = {
          User = mkForce "gitea";
          Group = mkForce "gitea";
        };
      };

      systemd.services.restic-init-gitea = {
        description = "Initialize Restic repository for Gitea backups";
        wantedBy = ["multi-user.target"];
        after = ["network-online.target"];
        wants = ["network-online.target"];
        path = [pkgs.restic];
        serviceConfig = {
          Type = "oneshot";
          User = "gitea";
          Group = "gitea";
          RemainAfterExit = true;
          EnvironmentFile = cfg.restic.environmentFile;
        };
        script = ''
          export RESTIC_PASSWORD=$(cat ${cfg.restic.passwordFile})
          restic -r s3:${cfg.s3.endpoint}/${cfg.restic.bucket} snapshots &>/dev/null || \
            restic -r s3:${cfg.s3.endpoint}/${cfg.restic.bucket} init
        '';
      };
    };
}
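A host could wire the module above in roughly as follows; this is a sketch, and the sops secret names (`litestream-env`, `restic-password`, `restic-env`) are hypothetical placeholders, not taken from the diff:

```nix
# Hypothetical host configuration enabling my.gitea.
{config, ...}: {
  imports = [../../modules/gitea.nix];

  my.gitea = {
    enable = true;
    litestream = {
      bucket = "example-gitea-litestream"; # illustrative bucket name
      secretFile = config.sops.secrets.litestream-env.path;
    };
    restic = {
      bucket = "example-gitea-restic"; # illustrative bucket name
      passwordFile = config.sops.secrets.restic-password.path;
      environmentFile = config.sops.secrets.restic-env.path;
    };
    # s3.endpoint keeps its Backblaze B2 default.
  };
}
```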
@@ -10,231 +10,248 @@ in {
  options.my.pgbackrest = {
    enable = mkEnableOption "pgBackRest PostgreSQL backup";

    stanza =
      mkOption {
        type = types.str;
        default = "main";
        description = "Name of the pgBackRest stanza";
      };

    secretFile =
      mkOption {
        type = types.path;
        description = "Path to the environment file containing S3 credentials and cipher passphrase";
      };

    s3 =
      mkOption {
        type =
          types.submodule {
            options = {
              endpoint =
                mkOption {
                  type = types.str;
                  default = "s3.eu-central-003.backblazeb2.com";
                  description = "S3 endpoint URL";
                };
              bucket =
                mkOption {
                  type = types.str;
                  description = "S3 bucket name";
                };
              region =
                mkOption {
                  type = types.str;
                  default = "eu-central-003";
                  description = "S3 region";
                };
              path =
                mkOption {
                  type = types.str;
                  default = "/backups";
                  description = "Path within the S3 bucket";
                };
            };
          };
        default = {};
        description = "S3 storage configuration";
      };

    retention =
      mkOption {
        type =
          types.submodule {
            options = {
              full =
                mkOption {
                  type = types.int;
                  default = 7;
                  description = "Number of full backups to retain";
                };
              diff =
                mkOption {
                  type = types.int;
                  default = 7;
                  description = "Number of differential backups to retain";
                };
            };
          };
        default = {};
        description = "Backup retention configuration";
      };

    compression =
      mkOption {
        type =
          types.submodule {
            options = {
              type =
                mkOption {
                  type = types.str;
                  default = "zst";
                  description = "Compression algorithm (none, gz, lz4, zst)";
                };
              level =
                mkOption {
                  type = types.int;
                  default = 3;
                  description = "Compression level";
                };
            };
          };
        default = {};
        description = "Compression configuration";
      };

    processMax =
      mkOption {
        type = types.int;
        default = 2;
        description = "Maximum number of processes for parallel operations";
      };

    schedule =
      mkOption {
        type =
          types.submodule {
            options = {
              full =
                mkOption {
                  type = types.str;
                  default = "daily";
                  description = "OnCalendar expression for full backups";
                };
              diff =
                mkOption {
                  type = types.str;
                  default = "hourly";
                  description = "OnCalendar expression for differential backups";
                };
            };
          };
        default = {};
        description = "Backup schedule configuration";
      };
  };

  config =
    mkIf cfg.enable (let
      archivePushScript =
        pkgs.writeShellScript "pgbackrest-archive-push" ''
          set -a
          source ${cfg.secretFile}
          set +a
          exec ${pkgs.pgbackrest}/bin/pgbackrest --stanza=${cfg.stanza} archive-push "$1"
        '';
    in {
      environment.systemPackages = [
        pkgs.pgbackrest
        (pkgs.writeShellScriptBin "pgbackrest-wrapper" ''
          set -a
          source ${cfg.secretFile}
          set +a
          exec ${pkgs.pgbackrest}/bin/pgbackrest "$@"
        '')
      ];

      services.postgresql.settings = {
        archive_mode = "on";
        archive_command = "${archivePushScript} %p";
      };

      environment.etc."pgbackrest/pgbackrest.conf".text = ''
        [global]
        repo1-type=s3
        repo1-s3-endpoint=${cfg.s3.endpoint}
        repo1-s3-bucket=${cfg.s3.bucket}
        repo1-s3-region=${cfg.s3.region}
        repo1-path=${cfg.s3.path}
        repo1-retention-full=${toString cfg.retention.full}
        repo1-retention-diff=${toString cfg.retention.diff}
        repo1-cipher-type=aes-256-cbc
        compress-type=${cfg.compression.type}
        compress-level=${toString cfg.compression.level}
        process-max=${toString cfg.processMax}
        log-level-console=info
        log-level-file=detail
        log-path=/var/log/pgbackrest
        spool-path=/var/spool/pgbackrest

        [${cfg.stanza}]
        pg1-path=/var/lib/postgresql/${config.services.postgresql.package.psqlSchema}
        pg1-user=postgres
      '';

      systemd.services.pgbackrest-stanza-create = {
        description = "pgBackRest Stanza Create";
        after = ["postgresql.service"];
        requires = ["postgresql.service"];
        path = [pkgs.pgbackrest];
        serviceConfig = {
          Type = "oneshot";
          User = "postgres";
          EnvironmentFile = cfg.secretFile;
          RemainAfterExit = true;
        };
        script = ''
          pgbackrest --stanza=${cfg.stanza} stanza-create || true
        '';
      };

      systemd.services.pgbackrest-backup = {
        description = "pgBackRest Full Backup";
        after = ["postgresql.service" "pgbackrest-stanza-create.service"];
        requires = ["postgresql.service"];
        wants = ["pgbackrest-stanza-create.service"];
        path = [pkgs.pgbackrest];
        serviceConfig = {
          Type = "oneshot";
          User = "postgres";
          EnvironmentFile = cfg.secretFile;
        };
        script = ''
          pgbackrest --stanza=${cfg.stanza} backup --type=full
        '';
      };

      systemd.timers.pgbackrest-backup = {
        wantedBy = ["timers.target"];
        timerConfig = {
          OnCalendar = cfg.schedule.full;
          Persistent = true;
          RandomizedDelaySec = "1h";
        };
      };

      systemd.services.pgbackrest-backup-diff = {
        description = "pgBackRest Differential Backup";
        after = ["postgresql.service" "pgbackrest-stanza-create.service"];
        requires = ["postgresql.service"];
        wants = ["pgbackrest-stanza-create.service"];
        path = [pkgs.pgbackrest];
        serviceConfig = {
          Type = "oneshot";
          User = "postgres";
          EnvironmentFile = cfg.secretFile;
        };
        script = ''
          pgbackrest --stanza=${cfg.stanza} backup --type=diff
        '';
      };

      systemd.timers.pgbackrest-backup-diff = {
        wantedBy = ["timers.target"];
        timerConfig = {
          OnCalendar = cfg.schedule.diff;
          Persistent = true;
          RandomizedDelaySec = "5m";
        };
      };

      systemd.tmpfiles.rules = [
        "d /var/lib/pgbackrest 0750 postgres postgres -"
        "d /var/log/pgbackrest 0750 postgres postgres -"
        "d /var/spool/pgbackrest 0750 postgres postgres -"
      ];
    });
}
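Enabling the pgBackRest module from a host might look roughly like this; the sops secret name `pgbackrest-env` and the bucket name are hypothetical placeholders, not taken from the diff:

```nix
# Hypothetical host configuration enabling my.pgbackrest.
{config, ...}: {
  my.pgbackrest = {
    enable = true;
    secretFile = config.sops.secrets.pgbackrest-env.path;
    s3.bucket = "example-postgres-backups"; # illustrative bucket name
    # stanza, retention, compression, processMax and schedule keep their defaults
    # (daily full backups, hourly differentials, 7 of each retained).
  };
}
```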
@@ -1,437 +0,0 @@
{
  config,
  lib,
  pkgs,
  ...
}:
with lib; let
  cfg = config.services.syncthing;
  settingsFormat = pkgs.formats.json {};
  cleanedConfig = converge (filterAttrsRecursive (_: v: v != null && v != {})) cfg.settings;

  isUnixGui = (builtins.substring 0 1 cfg.guiAddress) == "/";

  curlAddressArgs = path:
    if isUnixGui
    then "--unix-socket ${cfg.guiAddress} http://.${path}"
    else "${cfg.guiAddress}${path}";

  devices = mapAttrsToList (_: device: device // {deviceID = device.id;}) cfg.settings.devices;
  anyAutoAccept = builtins.any (dev: dev.autoAcceptFolders) devices;

  folders =
    mapAttrsToList (_: folder:
      folder
      // {
        devices = let
          folderDevices = folder.devices;
        in
          map (
            device:
              if builtins.isString device
              then {deviceId = cfg.settings.devices.${device}.id;}
              else if builtins.isAttrs device
              then {deviceId = cfg.settings.devices.${device.name}.id;} // device
              else throw "Invalid type for devices in folder; expected list or attrset."
          )
          folderDevices;
      }) (filterAttrs (_: folder: folder.enable) cfg.settings.folders);

  jq = "${pkgs.jq}/bin/jq";
  updateConfig =
    pkgs.writers.writeBash "merge-syncthing-config" (
      ''
        set -efu
        umask 0077

        curl() {
          while
            ! ${pkgs.libxml2}/bin/xmllint \
              --xpath 'string(configuration/gui/apikey)' \
              ${cfg.configDir}/config.xml \
              >"$TMPDIR/api_key"
          do sleep 1; done
          (printf "X-API-Key: "; cat "$TMPDIR/api_key") >"$TMPDIR/headers"
          ${pkgs.curl}/bin/curl -sSLk -H "@$TMPDIR/headers" \
            --retry 1000 --retry-delay 1 --retry-all-errors \
            "$@"
        }
      ''
      + (lib.pipe {
        devs = {
          new_conf_IDs = map (v: v.id) devices;
          GET_IdAttrName = "deviceID";
          override = cfg.overrideDevices;
          conf = devices;
          baseAddress = curlAddressArgs "/rest/config/devices";
        };
        dirs = {
          new_conf_IDs = map (v: v.id) folders;
          GET_IdAttrName = "id";
          override = cfg.overrideFolders;
          conf = folders;
          baseAddress = curlAddressArgs "/rest/config/folders";
        };
      } [
        (mapAttrs (
          conf_type: s:
            lib.pipe s.conf [
              (map (
                new_cfg: let
                  jsonPreSecretsFile =
                    pkgs.writeTextFile {
                      name = "${conf_type}-${new_cfg.id}-conf-pre-secrets.json";
                      text = builtins.toJSON new_cfg;
                    };
                  injectSecretsJqCmd =
                    {
                      "devs" = "${jq} .";
                      "dirs" = let
                        folder = new_cfg;
                        devicesWithSecrets =
                          lib.pipe folder.devices [
                            (lib.filter (device: (builtins.isAttrs device) && device ? encryptionPasswordFile))
                            (map (device: {
                              deviceId = device.deviceId;
                              variableName = "secret_${builtins.hashString "sha256" device.encryptionPasswordFile}";
                              secretPath = device.encryptionPasswordFile;
                            }))
                          ];
                        jqUpdates =
                          map (device: ''
                            .devices[] |= (
                              if .deviceId == "${device.deviceId}" then
                                del(.encryptionPasswordFile) |
                                .encryptionPassword = ''$${device.variableName}
                              else
                                .
                              end
                            )
                          '')
                          devicesWithSecrets;
                        jqRawFiles = map (device: "--rawfile ${device.variableName} ${lib.escapeShellArg device.secretPath}") devicesWithSecrets;
                      in "${jq} ${lib.concatStringsSep " " jqRawFiles} ${lib.escapeShellArg (lib.concatStringsSep "|" (["."] ++ jqUpdates))}";
                    }.${
                      conf_type
                    };
                in ''
                  ${injectSecretsJqCmd} ${jsonPreSecretsFile} | curl --json @- -X POST ${s.baseAddress}
                ''
              ))
              (lib.concatStringsSep "\n")
            ]
            + lib.optionalString s.override ''
              stale_${conf_type}_ids="$(curl -X GET ${s.baseAddress} | ${jq} \
                --argjson new_ids ${lib.escapeShellArg (builtins.toJSON s.new_conf_IDs)} \
                --raw-output \
                '[.[].${s.GET_IdAttrName}] - $new_ids | .[]'
              )"
              for id in ''${stale_${conf_type}_ids}; do
                >&2 echo "Deleting stale device: $id"
                curl -X DELETE ${s.baseAddress}/$id
              done
            ''
        ))
        builtins.attrValues
        (lib.concatStringsSep "\n")
      ])
      + (lib.pipe cleanedConfig [
        builtins.attrNames
        (lib.subtractLists ["folders" "devices"])
        (map (subOption: ''
          curl -X PUT -d ${lib.escapeShellArg (builtins.toJSON cleanedConfig.${subOption})} ${curlAddressArgs "/rest/config/${subOption}"}
        ''))
        (lib.concatStringsSep "\n")
      ])
      + ''
        if curl ${curlAddressArgs "/rest/config/restart-required"} |
          ${jq} -e .requiresRestart > /dev/null; then
          curl -X POST ${curlAddressArgs "/rest/system/restart"}
        fi
      ''
    );
in {
  options = {
    services.syncthing = {
      enable = mkEnableOption "Syncthing, a self-hosted open-source alternative to Dropbox and Bittorrent Sync";

      cert =
        mkOption {
          type = types.nullOr types.str;
          default = null;
          description = "Path to the cert.pem file, which will be copied into Syncthing's configDir.";
        };

      key =
        mkOption {
          type = types.nullOr types.str;
          default = null;
          description = "Path to the key.pem file, which will be copied into Syncthing's configDir.";
        };

      overrideDevices =
        mkOption {
          type = types.bool;
          default = true;
          description = "Whether to delete the devices which are not configured via the devices option.";
        };

      overrideFolders =
        mkOption {
          type = types.bool;
          default = !anyAutoAccept;
          description = "Whether to delete the folders which are not configured via the folders option.";
        };

      settings =
        mkOption {
          type =
            types.submodule {
              freeformType = settingsFormat.type;
              options = {
                options =
                  mkOption {
                    default = {};
                    description = "The options element contains all other global configuration options";
                    type =
                      types.submodule {
                        freeformType = settingsFormat.type;
                        options = {
                          localAnnounceEnabled =
                            mkOption {
                              type = types.nullOr types.bool;
                              default = null;
                              description = "Whether to send announcements to the local LAN.";
                            };
                          globalAnnounceEnabled =
                            mkOption {
                              type = types.nullOr types.bool;
                              default = null;
                              description = "Whether to send announcements to the global discovery servers.";
                            };
                          relaysEnabled =
                            mkOption {
                              type = types.nullOr types.bool;
                              default = null;
                              description = "When true, relays will be connected to and potentially used for device to device connections.";
                            };
                          urAccepted =
                            mkOption {
                              type = types.nullOr types.int;
                              default = null;
                              description = "Whether the user has accepted to submit anonymous usage data.";
                            };
                        };
                      };
                  };

                devices =
                  mkOption {
                    default = {};
                    description = "Peers/devices which Syncthing should communicate with.";
                    type =
                      types.attrsOf (types.submodule ({name, ...}: {
                        freeformType = settingsFormat.type;
                        options = {
                          name =
                            mkOption {
                              type = types.str;
                              default = name;
                              description = "The name of the device.";
                            };
                          id =
                            mkOption {
                              type = types.str;
                              description = "The device ID.";
                            };
                          autoAcceptFolders =
                            mkOption {
                              type = types.bool;
                              default = false;
                              description = "Automatically create or share folders that this device advertises at the default path.";
                            };
                        };
                      }));
                  };

                folders =
                  mkOption {
                    default = {};
                    description = "Folders which should be shared by Syncthing.";
                    type =
                      types.attrsOf (types.submodule ({name, ...}: {
                        freeformType = settingsFormat.type;
                        options = {
                          enable =
                            mkOption {
                              type = types.bool;
                              default = true;
                              description = "Whether to share this folder.";
                            };
                          path =
                            mkOption {
                              type = types.str;
                              default = name;
                              description = "The path to the folder which should be shared.";
                            };
                          id =
                            mkOption {
                              type = types.str;
                              default = name;
                              description = "The ID of the folder. Must be the same on all devices.";
                            };
                          label =
                            mkOption {
                              type = types.str;
                              default = name;
                              description = "The label of the folder.";
                            };
                          type =
                            mkOption {
                              type = types.enum ["sendreceive" "sendonly" "receiveonly" "receiveencrypted"];
                              default = "sendreceive";
                              description = "Controls how the folder is handled by Syncthing.";
                            };
                          devices =
                            mkOption {
                              type =
                                types.listOf (types.oneOf [
                                  types.str
                                  (types.submodule {
                                    freeformType = settingsFormat.type;
                                    options = {
                                      name =
                                        mkOption {
                                          type = types.str;
                                          description = "The name of a device defined in the devices option.";
                                        };
                                      encryptionPasswordFile =
                                        mkOption {
                                          type = types.nullOr types.path;
                                          default = null;
                                          description = "Path to encryption password file.";
                                        };
                                    };
                                  })
                                ]);
                              default = [];
                              description = "The devices this folder should be shared with.";
                            };
                        };
                      }));
                  };
              };
            };
          default = {};
          description = "Extra configuration options for Syncthing.";
        };

      guiAddress =
        mkOption {
          type = types.str;
          default = "127.0.0.1:8384";
          description = "The address to serve the web interface at.";
        };

      user =
        mkOption {
          type = types.str;
          default = "syncthing";
          description = "The user to run Syncthing as.";
        };

      group =
        mkOption {
          type = types.str;
          default = "syncthing";
          description = "The group to run Syncthing under.";
        };

      dataDir =
        mkOption {
          type = types.path;
          default = "/var/lib/syncthing";
description = "The path where synchronised directories will exist.";
|
|
||||||
};
|
|
||||||
|
|
||||||
configDir =
|
|
||||||
mkOption {
|
|
||||||
type = types.path;
|
|
||||||
default = cfg.dataDir + "/.config/syncthing";
|
|
||||||
description = "The path where the settings and keys will exist.";
|
|
||||||
};
|
|
||||||
|
|
||||||
openDefaultPorts =
|
|
||||||
mkOption {
|
|
||||||
type = types.bool;
|
|
||||||
default = false;
|
|
||||||
description = "Whether to open the default ports in the firewall (not applicable on Darwin).";
|
|
||||||
};
|
|
||||||
|
|
||||||
package = mkPackageOption pkgs "syncthing" {};
|
|
||||||
};
|
|
||||||
};
|
|
||||||
|
|
||||||
config =
|
|
||||||
mkIf cfg.enable {
|
|
||||||
assertions = [
|
|
||||||
{
|
|
||||||
assertion = !(cfg.overrideFolders && anyAutoAccept);
|
|
||||||
message = "services.syncthing.overrideFolders will delete auto-accepted folders from the configuration, creating path conflicts.";
|
|
||||||
}
|
|
||||||
];
|
|
||||||
|
|
||||||
environment.systemPackages = [cfg.package];
|
|
||||||
|
|
||||||
launchd.user.agents.syncthing = {
|
|
||||||
serviceConfig = {
|
|
||||||
ProgramArguments = [
|
|
||||||
"${cfg.package}/bin/syncthing"
|
|
||||||
"--no-browser"
|
|
||||||
"--gui-address=${
|
|
||||||
if isUnixGui
|
|
||||||
then "unix://"
|
|
||||||
else ""
|
|
||||||
}${cfg.guiAddress}"
|
|
||||||
"--config=${cfg.configDir}"
|
|
||||||
"--data=${cfg.configDir}"
|
|
||||||
];
|
|
||||||
EnvironmentVariables = {
|
|
||||||
STNORESTART = "yes";
|
|
||||||
STNOUPGRADE = "yes";
|
|
||||||
};
|
|
||||||
KeepAlive = true;
|
|
||||||
RunAtLoad = true;
|
|
||||||
ProcessType = "Background";
|
|
||||||
StandardOutPath = "${cfg.configDir}/syncthing.log";
|
|
||||||
StandardErrorPath = "${cfg.configDir}/syncthing.log";
|
|
||||||
};
|
|
||||||
};
|
|
||||||
|
|
||||||
launchd.user.agents.syncthing-init =
|
|
||||||
mkIf (cleanedConfig != {}) {
|
|
||||||
serviceConfig = {
|
|
||||||
ProgramArguments = ["${updateConfig}"];
|
|
||||||
RunAtLoad = true;
|
|
||||||
KeepAlive = false;
|
|
||||||
ProcessType = "Background";
|
|
||||||
StandardOutPath = "${cfg.configDir}/syncthing-init.log";
|
|
||||||
StandardErrorPath = "${cfg.configDir}/syncthing-init.log";
|
|
||||||
};
|
|
||||||
};
|
|
||||||
|
|
||||||
system.activationScripts.syncthing =
|
|
||||||
mkIf (cfg.cert != null || cfg.key != null) ''
|
|
||||||
echo "Setting up Syncthing certificates..."
|
|
||||||
mkdir -p ${cfg.configDir}
|
|
||||||
${optionalString (cfg.cert != null) ''
|
|
||||||
cp ${toString cfg.cert} ${cfg.configDir}/cert.pem
|
|
||||||
chmod 644 ${cfg.configDir}/cert.pem
|
|
||||||
''}
|
|
||||||
${optionalString (cfg.key != null) ''
|
|
||||||
cp ${toString cfg.key} ${cfg.configDir}/key.pem
|
|
||||||
chmod 600 ${cfg.configDir}/key.pem
|
|
||||||
''}
|
|
||||||
'';
|
|
||||||
};
|
|
||||||
}
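The devices and folders options above compose under settings; a minimal sketch of how a consumer might set them (the device name, ID, and path here are hypothetical, not from this repository):

```nix
# Hypothetical usage of the module above; the device ID and paths are made up.
services.syncthing = {
  enable = true;
  settings = {
    devices.laptop = {
      id = "AAAAAAA-BBBBBBB-CCCCCCC-DDDDDDD-EEEEEEE-FFFFFFF-GGGGGGG-HHHHHHH";
      autoAcceptFolders = false;
    };
    folders.notes = {
      path = "/Users/me/Notes";
      devices = ["laptop"]; # a plain string refers to a device defined above
    };
  };
};
```

Because both submodules set freeformType to settingsFormat.type, any extra Syncthing config keys pass through alongside the declared options.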
overlays/lumen.nix (new file, 7 lines)
@@ -0,0 +1,7 @@
{inputs}: final: prev: {
  lumen = import ../lib/build-rust-package.nix {
    inherit prev;
    input = inputs.lumen;
  };
}
overlays/nono.nix (new file, 19 lines)
@@ -0,0 +1,19 @@
{inputs}: final: prev: let
  manifest = (prev.lib.importTOML "${inputs.nono}/Cargo.toml").package;
in {
  nono = prev.rustPlatform.buildRustPackage {
    pname = manifest.name;
    version = manifest.version;

    cargoLock.lockFile = "${inputs.nono}/Cargo.lock";

    src = inputs.nono;

    nativeBuildInputs = with prev; [pkg-config];
    buildInputs = with prev; [openssl dbus];
    OPENSSL_NO_VENDOR = 1;

    doCheck = false;
  };
}
overlays/openusage.nix (new file, 132 lines)
@@ -0,0 +1,132 @@
{inputs}: final: prev: let
  version = "0.5.1";
in {
  openusage = prev.rustPlatform.buildRustPackage (finalAttrs: {
    pname = "openusage";
    inherit version;

    src = inputs.openusage;

    cargoRoot = "src-tauri";
    cargoLock = {
      lockFile = "${inputs.openusage}/src-tauri/Cargo.lock";
      outputHashes = {
        "tauri-nspanel-2.1.0" = "sha256-PLACEHOLDER";
        "tauri-plugin-aptabase-1.0.0" = "sha256-PLACEHOLDER";
      };
    };
    buildAndTestSubdir = finalAttrs.cargoRoot;

    node_modules = prev.stdenv.mkDerivation {
      inherit (finalAttrs) src version;
      pname = "${finalAttrs.pname}-node_modules";

      impureEnvVars =
        prev.lib.fetchers.proxyImpureEnvVars
        ++ [
          "GIT_PROXY_COMMAND"
          "SOCKS_SERVER"
        ];

      nativeBuildInputs = [
        prev.bun
        prev.writableTmpDirAsHomeHook
      ];

      dontConfigure = true;
      dontFixup = true;
      dontPatchShebangs = true;

      buildPhase = ''
        runHook preBuild

        export BUN_INSTALL_CACHE_DIR=$(mktemp -d)

        bun install \
          --no-progress \
          --frozen-lockfile \
          --ignore-scripts

        runHook postBuild
      '';

      installPhase = ''
        runHook preInstall
        cp -R ./node_modules $out
        runHook postInstall
      '';

      outputHash = "sha256-PLACEHOLDER";
      outputHashMode = "recursive";
    };

    nativeBuildInputs = [
      prev.cargo-tauri.hook
      prev.rustPlatform.bindgenHook
      prev.bun
      prev.nodejs
      prev.pkg-config
      prev.makeBinaryWrapper
    ];

    buildInputs = prev.lib.optionals prev.stdenv.isDarwin (
      with prev.darwin.apple_sdk.frameworks; [
        AppKit
        CoreFoundation
        CoreServices
        Security
        WebKit
      ]
    );

    # Disable updater artifact generation; we don't have signing keys.
    tauriConf = builtins.toJSON {bundle.createUpdaterArtifacts = false;};
    passAsFile = ["tauriConf"];
    preBuild = ''
      tauriBuildFlags+=(
        "--config"
        "$tauriConfPath"
      )
    '';

    configurePhase = ''
      runHook preConfigure

      # Copy pre-fetched node_modules
      cp -R ${finalAttrs.node_modules} node_modules/
      chmod -R u+rw node_modules
      chmod -R u+x node_modules/.bin
      patchShebangs node_modules

      export HOME=$TMPDIR
      export PATH="$PWD/node_modules/.bin:$PATH"

      # Bundle plugins (copy from plugins/ to src-tauri/resources/bundled_plugins/)
      ${prev.nodejs}/bin/node copy-bundled.cjs

      runHook postConfigure
    '';

    env = {
      OPENSSL_NO_VENDOR = true;
    };

    doCheck = false;

    postInstall = prev.lib.optionalString prev.stdenv.isDarwin ''
      makeWrapper $out/Applications/OpenUsage.app/Contents/MacOS/OpenUsage $out/bin/openusage
    '';

    meta = {
      description = "Track all your AI coding subscriptions in one place";
      homepage = "https://github.com/robinebers/openusage";
      license = prev.lib.licenses.mit;
      platforms = prev.lib.platforms.darwin;
      mainProgram = "openusage";
    };
  });
}
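The "sha256-PLACEHOLDER" values above must be filled in before this derivation can build. A common workflow (a sketch, not part of this overlay) is to substitute nixpkgs' lib.fakeHash, run the build once, and copy the real hash from the resulting mismatch error:

```nix
# Sketch: lib.fakeHash stands in for an unknown fixed-output hash.
# `nix build` will fail with "hash mismatch" and print the correct
# sha256-… value to paste back in; the same trick works for outputHash
# on the node_modules derivation.
cargoLock.outputHashes = {
  "tauri-nspanel-2.1.0" = prev.lib.fakeHash;
  "tauri-plugin-aptabase-1.0.0" = prev.lib.fakeHash;
};
```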
overlays/overseer.nix (new file, 101 lines)
@@ -0,0 +1,101 @@
{inputs}: final: prev: let
  manifest = (prev.lib.importTOML "${inputs.overseer}/overseer/Cargo.toml").package;

  overseer-cli = prev.rustPlatform.buildRustPackage {
    pname = "overseer-cli";
    version = manifest.version;

    cargoLock.lockFile = "${inputs.overseer}/overseer/Cargo.lock";

    src = "${inputs.overseer}/overseer";

    nativeBuildInputs = with prev; [
      pkg-config
    ];

    buildInputs = with prev; [
      openssl
    ];

    OPENSSL_NO_VENDOR = 1;

    doCheck = false;
  };

  overseer-host = prev.buildNpmPackage {
    pname = "overseer-host";
    version = manifest.version;

    src = "${inputs.overseer}/host";

    npmDepsHash = "sha256-WIjx6N8vnH3C6Kxn4tiryi3bM0xnov5ok2k9XrndIS0=";

    buildPhase = ''
      runHook preBuild
      npm run build
      runHook postBuild
    '';

    installPhase = ''
      runHook preInstall
      mkdir -p $out
      cp -r dist $out/
      cp -r node_modules $out/
      cp package.json $out/
      runHook postInstall
    '';
  };

  overseer-ui = prev.buildNpmPackage {
    pname = "overseer-ui";
    version = manifest.version;

    src = "${inputs.overseer}/ui";

    npmDepsHash = "sha256-krOsSd8OAPsdCOCf1bcz9c/Myj6jpHOkaD/l+R7PQpY=";

    buildPhase = ''
      runHook preBuild
      npm run build
      runHook postBuild
    '';

    installPhase = ''
      runHook preInstall
      mkdir -p $out
      cp -r dist $out/
      runHook postInstall
    '';
  };
in {
  # The CLI looks for host/dist/index.js and ui/dist relative to the binary,
  # using paths like: exe_dir.join("../@dmmulroy/overseer/host/dist/index.js")
  # So we create: bin/os and @dmmulroy/overseer/host/dist/index.js
  overseer = prev.runCommand "overseer-${manifest.version}" {
    nativeBuildInputs = [prev.makeWrapper];
  } ''
    # Create npm-like structure that the CLI expects
    mkdir -p $out/bin
    mkdir -p $out/@dmmulroy/overseer/host
    mkdir -p $out/@dmmulroy/overseer/ui

    # Copy host files
    cp -r ${overseer-host}/dist $out/@dmmulroy/overseer/host/
    cp -r ${overseer-host}/node_modules $out/@dmmulroy/overseer/host/
    cp ${overseer-host}/package.json $out/@dmmulroy/overseer/host/

    # Copy UI files
    cp -r ${overseer-ui}/dist $out/@dmmulroy/overseer/ui/

    # Copy CLI binary
    cp ${overseer-cli}/bin/os $out/bin/os

    # Make wrapper that ensures node is available
    wrapProgram $out/bin/os \
      --prefix PATH : ${prev.nodejs}/bin
  '';
}
profiles/aerospace.nix (new file, 142 lines)
@@ -0,0 +1,142 @@
{
  programs.aerospace = {
    enable = true;
    launchd.enable = true;
    settings = {
      start-at-login = true;
      accordion-padding = 30;
      default-root-container-layout = "tiles";
      default-root-container-orientation = "auto";
      on-focused-monitor-changed = [
        "move-mouse monitor-lazy-center"
      ];

      workspace-to-monitor-force-assignment = {
        "1" = "secondary";
        "2" = "secondary";
        "3" = "secondary";
        "4" = "secondary";
        "5" = "secondary";
        "6" = "secondary";
        "7" = "secondary";
        "8" = "secondary";
        "9" = "main";
      };

      gaps = {
        inner = {
          horizontal = 8;
          vertical = 8;
        };
        outer = {
          left = 8;
          right = 8;
          top = 8;
          bottom = 8;
        };
      };

      on-window-detected = [
        {
          "if" = {
            "app-id" = "com.apple.systempreferences";
          };
          run = "layout floating";
        }
        {
          "if" = {
            "app-id" = "com.mitchellh.ghostty";
          };
          run = ["layout tiling" "move-node-to-workspace 3"];
        }
        {
          "if" = {
            "app-id" = "net.imput.helium";
          };
          run = "move-node-to-workspace 2";
        }
        {
          "if" = {
            "app-id" = "com.tinyspeck.slackmacgap";
          };
          run = "move-node-to-workspace 5";
        }
        {
          "if" = {
            "app-id" = "net.whatsapp.WhatsApp";
          };
          run = "move-node-to-workspace 5";
        }
        {
          "if" = {
            "app-id" = "com.tidal.desktop";
          };
          run = "move-node-to-workspace 6";
        }
      ];

      mode = {
        main.binding = {
          "alt-enter" = "exec-and-forget open -a Ghostty";
          "alt-h" = "focus left";
          "alt-j" = "focus down";
          "alt-k" = "focus up";
          "alt-l" = "focus right";
          "alt-shift-h" = "move left";
          "alt-shift-j" = "move down";
          "alt-shift-k" = "move up";
          "alt-shift-l" = "move right";
          "alt-ctrl-h" = "focus-monitor --wrap-around left";
          "alt-ctrl-j" = "focus-monitor --wrap-around down";
          "alt-ctrl-k" = "focus-monitor --wrap-around up";
          "alt-ctrl-l" = "focus-monitor --wrap-around right";
          "alt-ctrl-shift-h" = "move-node-to-monitor --focus-follows-window --wrap-around left";
          "alt-ctrl-shift-j" = "move-node-to-monitor --focus-follows-window --wrap-around down";
          "alt-ctrl-shift-k" = "move-node-to-monitor --focus-follows-window --wrap-around up";
          "alt-ctrl-shift-l" = "move-node-to-monitor --focus-follows-window --wrap-around right";
          "alt-space" = "layout tiles accordion";
          "alt-shift-space" = "layout floating tiling";
          "alt-slash" = "layout horizontal vertical";
          "alt-f" = "fullscreen";
          "alt-tab" = "workspace-back-and-forth";
          "alt-shift-tab" = "move-workspace-to-monitor --wrap-around next";
          "alt-r" = "mode resize";
          "alt-shift-semicolon" = "mode service";
          "alt-1" = "workspace 1";
          "alt-2" = "workspace 2";
          "alt-3" = "workspace 3";
          "alt-4" = "workspace 4";
          "alt-5" = "workspace 5";
          "alt-6" = "workspace 6";
          "alt-7" = "workspace 7";
          "alt-8" = "workspace 8";
          "alt-9" = "workspace 9";
          "alt-shift-1" = "move-node-to-workspace --focus-follows-window 1";
          "alt-shift-2" = "move-node-to-workspace --focus-follows-window 2";
          "alt-shift-3" = "move-node-to-workspace --focus-follows-window 3";
          "alt-shift-4" = "move-node-to-workspace --focus-follows-window 4";
          "alt-shift-5" = "move-node-to-workspace --focus-follows-window 5";
          "alt-shift-6" = "move-node-to-workspace --focus-follows-window 6";
          "alt-shift-7" = "move-node-to-workspace --focus-follows-window 7";
          "alt-shift-8" = "move-node-to-workspace --focus-follows-window 8";
          "alt-shift-9" = "move-node-to-workspace --focus-follows-window 9";
        };
        resize.binding = {
          "h" = "resize width -50";
          "j" = "resize height +50";
          "k" = "resize height -50";
          "l" = "resize width +50";
          "enter" = "mode main";
          "esc" = "mode main";
        };
        service.binding = {
          "esc" = "mode main";
          "r" = ["reload-config" "mode main"];
          "b" = ["balance-sizes" "mode main"];
          "f" = ["layout floating tiling" "mode main"];
          "backspace" = ["close-all-windows-but-current" "mode main"];
        };
      };
    };
  };
}
@@ -1,7 +1,7 @@
 {
   programs.atuin = {
     enable = true;
-    enableFishIntegration = true;
+    enableNushellIntegration = true;
     flags = [
       "--disable-up-arrow"
     ];
profiles/claude-code.nix (new file, 9 lines)
@@ -0,0 +1,9 @@
{
  inputs,
  pkgs,
  ...
}: {
  home.packages = [
    inputs.llm-agents.packages.${pkgs.stdenv.hostPlatform.system}.claude-code
  ];
}
@@ -1,5 +1,6 @@
 {pkgs, ...}: {
   programs.fish.enable = true;
+  environment.shells = [pkgs.nushell];

   nixpkgs = {
     config = {
@@ -1,8 +1,8 @@
 {
-  constants,
-  inputs,
   pkgs,
+  inputs,
   user,
+  constants,
   ...
 }: {
   home-manager.extraSpecialArgs = {inherit user constants inputs;};
@@ -13,7 +13,6 @@

   defaults = {
     NSGlobalDomain = {
-      # null equals "Light"
       AppleInterfaceStyle = null;
       AppleShowAllExtensions = true;
       ApplePressAndHoldEnabled = false;
@@ -22,6 +21,17 @@
       "com.apple.mouse.tapBehavior" = 1;
       "com.apple.sound.beep.volume" = 0.0;
       "com.apple.sound.beep.feedback" = 0;
+      AppleShowScrollBars = "WhenScrolling";
+      NSAutomaticCapitalizationEnabled = false;
+      NSAutomaticDashSubstitutionEnabled = false;
+      NSAutomaticPeriodSubstitutionEnabled = false;
+      NSAutomaticQuoteSubstitutionEnabled = false;
+      NSAutomaticSpellingCorrectionEnabled = false;
+      NSDocumentSaveNewDocumentsToCloud = false;
+      NSNavPanelExpandedStateForSaveMode = true;
+      NSNavPanelExpandedStateForSaveMode2 = true;
+      PMPrintingExpandedStateForPrint = true;
+      PMPrintingExpandedStateForPrint2 = true;
     };

     dock = {
@@ -30,16 +40,68 @@
       launchanim = true;
       orientation = "bottom";
       tilesize = 60;
+      minimize-to-application = true;
+      mru-spaces = false;
+      expose-group-apps = true;
+      wvous-bl-corner = 1;
+      wvous-br-corner = 1;
+      wvous-tl-corner = 1;
+      wvous-tr-corner = 1;
     };

     finder = {
       _FXShowPosixPathInTitle = false;
+      AppleShowAllFiles = true;
+      FXEnableExtensionChangeWarning = false;
+      FXPreferredViewStyle = "clmv";
+      ShowPathbar = true;
+      ShowStatusBar = true;
     };

     trackpad = {
       Clicking = true;
       TrackpadThreeFingerDrag = true;
     };
+
+    screencapture = {
+      location = "~/Screenshots";
+      type = "png";
+      disable-shadow = true;
+    };
+
+    screensaver = {
+      askForPassword = true;
+      askForPasswordDelay = 5;
+    };
+
+    loginwindow = {
+      GuestEnabled = false;
+      DisableConsoleAccess = true;
+    };
+
+    spaces.spans-displays = false;
+
+    WindowManager.StandardHideWidgets = true;
+
+    menuExtraClock = {
+      Show24Hour = true;
+      ShowDate = 1;
+      ShowDayOfWeek = true;
+      ShowSeconds = false;
+    };
+
+    CustomUserPreferences = {
+      "com.apple.desktopservices" = {
+        DSDontWriteNetworkStores = true;
+        DSDontWriteUSBStores = true;
+      };
+      "com.apple.AdLib" = {
+        allowApplePersonalizedAdvertising = false;
+      };
+      "com.apple.Spotlight" = {
+        MenuItemHidden = true;
+      };
+    };
   };
 };

@@ -53,10 +115,10 @@
   };

   users.users.${user} = {
-    name = "${user}";
+    name = user;
     home = "/Users/${user}";
     isHidden = false;
-    shell = pkgs.fish;
+    shell = pkgs.nushell;
   };

   home-manager.useGlobalPkgs = true;
@@ -1,7 +1,7 @@
 {
   config,
-  pkgs,
   lib,
+  pkgs,
   user,
   ...
 }:
@@ -39,7 +39,7 @@ in {
   });
   default = [
     {path = "/Applications/Helium.app/";}
-    {path = "${config.users.users.${user}.home}/Applications/Home Manager Apps/Ghostty.app/";}
+    {path = "/Applications/Ghostty.app/";}
     {path = "/System/Applications/Calendar.app/";}
     {path = "/System/Applications/Mail.app/";}
     {path = "/System/Applications/Notes.app/";}
@@ -1,6 +0,0 @@
-{
-  programs.eza = {
-    enable = true;
-    enableFishIntegration = true;
-  };
-}
@@ -1,54 +0,0 @@
-{
-  programs.fish = {
-    enable = true;
-    functions = {
-      open_project = ''
-        set -l base "$HOME/Projects"
-        set -l choice (fd -t d -d 1 -a . "$base/Personal" "$base/Work" \
-          | string replace -r -- "^$base/" "" \
-          | fzf --prompt "project > ")
-        test -n "$choice"; and cd "$base/$choice"
-      '';
-    };
-    interactiveShellInit = ''
-      set fish_greeting
-
-      set fish_color_normal 4c4f69
-      set fish_color_command 1e66f5
-      set fish_color_param dd7878
-      set fish_color_keyword d20f39
-      set fish_color_quote 40a02b
-      set fish_color_redirection ea76cb
-      set fish_color_end fe640b
-      set fish_color_comment 8c8fa1
-      set fish_color_error d20f39
-      set fish_color_gray 9ca0b0
-      set fish_color_selection --background=ccd0da
-      set fish_color_search_match --background=ccd0da
-      set fish_color_option 40a02b
-      set fish_color_operator ea76cb
-      set fish_color_escape e64553
-      set fish_color_autosuggestion 9ca0b0
-      set fish_color_cancel d20f39
-      set fish_color_cwd df8e1d
-      set fish_color_user 179299
-      set fish_color_host 1e66f5
-      set fish_color_host_remote 40a02b
-      set fish_color_status d20f39
-      set fish_pager_color_progress 9ca0b0
-      set fish_pager_color_prefix ea76cb
-      set fish_pager_color_completion 4c4f69
-      set fish_pager_color_description 9ca0b0
-
-      set -gx LS_COLORS "$(vivid generate catppuccin-latte)"
-
-      set -gx COLORTERM truecolor
-      set -gx COLORFGBG "15;0"
-      set -gx TERM_BACKGROUND light
-
-      for mode in default insert
-        bind --mode $mode \cp open_project
-      end
-    '';
-  };
-}
@@ -1,7 +1,6 @@
 {
   programs.fzf = {
     enable = true;
-    enableFishIntegration = true;
   };

   home.sessionVariables = {
@@ -1,22 +1,70 @@
 {pkgs, ...}: {
-  programs.ghostty = {
-    enable = true;
-    package = pkgs.ghostty-bin;
-    settings = {
-      command = "${pkgs.fish}/bin/fish";
-      theme = "Catppuccin Latte";
-      window-padding-x = 12;
-      window-padding-y = 3;
-      window-padding-balance = true;
-      font-family = "TX-02 SemiCondensed";
-      font-size = 16.5;
-      cursor-style = "block";
-      mouse-hide-while-typing = true;
-      mouse-scroll-multiplier = 1.25;
-      shell-integration = "detect";
-      shell-integration-features = "no-cursor";
-      clipboard-read = "allow";
-      clipboard-write = "allow";
-    };
-  };
+  xdg.configFile."ghostty/config".text = ''
+    command = ${pkgs.nushell}/bin/nu
+    theme = Catppuccin Latte
+    window-padding-x = 12
+    window-padding-y = 3
+    window-padding-balance = true
+    font-family = TX-02
+    font-size = 16.5
+    cursor-style = block
+    mouse-hide-while-typing = true
+    mouse-scroll-multiplier = 1.25
+    shell-integration = none
+    shell-integration-features = no-cursor
+    clipboard-read = allow
+    clipboard-write = allow
+
+    keybind = ctrl+t>n=new_tab
+    keybind = ctrl+t>x=close_tab
+    keybind = ctrl+t>h=previous_tab
+    keybind = ctrl+t>left=previous_tab
+    keybind = ctrl+t>k=previous_tab
+    keybind = ctrl+t>up=previous_tab
+    keybind = ctrl+t>l=next_tab
+    keybind = ctrl+t>right=next_tab
+    keybind = ctrl+t>j=next_tab
+    keybind = ctrl+t>down=next_tab
+    keybind = ctrl+t>tab=last_tab
+    keybind = ctrl+t>one=goto_tab:1
+    keybind = ctrl+t>two=goto_tab:2
+    keybind = ctrl+t>three=goto_tab:3
+    keybind = ctrl+t>four=goto_tab:4
+    keybind = ctrl+t>five=goto_tab:5
+    keybind = ctrl+t>six=goto_tab:6
+    keybind = ctrl+t>seven=goto_tab:7
+    keybind = ctrl+t>eight=goto_tab:8
+    keybind = ctrl+t>nine=goto_tab:9
+
+    keybind = ctrl+p>n=new_split:auto
+    keybind = ctrl+p>d=new_split:down
+    keybind = ctrl+p>r=new_split:right
+    keybind = ctrl+p>x=close_surface
+    keybind = ctrl+p>f=toggle_split_zoom
+    keybind = ctrl+p>h=goto_split:left
+    keybind = ctrl+p>left=goto_split:left
+    keybind = ctrl+p>l=goto_split:right
+    keybind = ctrl+p>right=goto_split:right
+    keybind = ctrl+p>j=goto_split:down
+    keybind = ctrl+p>down=goto_split:down
+    keybind = ctrl+p>k=goto_split:up
+    keybind = ctrl+p>up=goto_split:up
+    keybind = ctrl+p>equal=equalize_splits
+
+    keybind = ctrl+n>h=resize_split:left,10
+    keybind = ctrl+n>left=resize_split:left,10
+    keybind = ctrl+n>j=resize_split:down,10
+    keybind = ctrl+n>down=resize_split:down,10
+    keybind = ctrl+n>k=resize_split:up,10
+    keybind = ctrl+n>up=resize_split:up,10
+    keybind = ctrl+n>l=resize_split:right,10
+    keybind = ctrl+n>right=resize_split:right,10
+    keybind = ctrl+n>equal=equalize_splits
+
+    keybind = alt+n=new_split:auto
+    keybind = alt+h=goto_split:left
+    keybind = alt+l=goto_split:right
+    keybind = alt+j=goto_split:down
+    keybind = alt+k=goto_split:up
+  '';
 }
@@ -12,6 +12,11 @@ in {
     autocrlf = "input";
     pager = "delta";
   };
+  credential = {
+    helper = "!gh auth git-credential";
+    "https://github.com".useHttpPath = true;
+    "https://gist.github.com".useHttpPath = true;
+  };
   pull.rebase = true;
   rebase.autoStash = true;
   interactive.diffFilter = "delta --color-only";
@@ -90,15 +95,10 @@
   gf = "git fetch";
   gfa = "git fetch --all --tags --prune";
   gfo = "git fetch origin";
-  gfg = "git ls-files | grep";
   gg = "git gui citool";
   gga = "git gui citool --amend";
-  ggpull = "git pull origin \"$(git branch --show-current)\"";
-  ggpush = "git push origin \"$(git branch --show-current)\"";
-  ggsup = "git branch --set-upstream-to=origin/$(git branch --show-current)";
   ghh = "git help";
   gignore = "git update-index --assume-unchanged";
-  gignored = "git ls-files -v | grep \"^[[:lower:]]\"";
   gl = "git pull";
   glg = "git log --stat";
   glgp = "git log --stat --patch";
@@ -113,7 +113,6 @@
   glols = "git log --graph --pretty=\"%Cred%h%Creset -%C(auto)%d%Creset %s %Cgreen(%ar) %C(bold blue)<%an>%Creset\" --stat";
   glod = "git log --graph --pretty=\"%Cred%h%Creset -%C(auto)%d%Creset %s %Cgreen(%ad) %C(bold blue)<%an>%Creset\"";
   glods = "git log --graph --pretty=\"%Cred%h%Creset -%C(auto)%d%Creset %s %Cgreen(%ad) %C(bold blue)<%an>%Creset\" --date=short";
-  gluc = "git pull upstream $(git branch --show-current)";
   glum = "git pull upstream main";
   gm = "git merge";
   gma = "git merge --abort";
@@ -128,7 +127,6 @@ in {
|
|||||||
gpd = "git push --dry-run";
|
gpd = "git push --dry-run";
|
||||||
gpf = "git push --force-with-lease";
|
gpf = "git push --force-with-lease";
|
||||||
gpod = "git push origin --delete";
|
gpod = "git push origin --delete";
|
||||||
gpoat = "git push origin --all && git push origin --tags";
|
|
||||||
gpr = "git pull --rebase";
|
gpr = "git pull --rebase";
|
||||||
gpra = "git pull --rebase --autostash";
|
gpra = "git pull --rebase --autostash";
|
||||||
gprav = "git pull --rebase --autostash -v";
|
gprav = "git pull --rebase --autostash -v";
|
||||||
@@ -137,8 +135,6 @@ in {
|
|||||||
gprv = "git pull --rebase -v";
|
gprv = "git pull --rebase -v";
|
||||||
gprum = "git pull --rebase upstream main";
|
gprum = "git pull --rebase upstream main";
|
||||||
gprumi = "git pull --rebase=interactive upstream main";
|
gprumi = "git pull --rebase=interactive upstream main";
|
||||||
gpsup = "git push --set-upstream origin $(git branch --show-current)";
|
|
||||||
gpsupf = "git push --set-upstream origin $(git branch --show-current) --force-with-lease";
|
|
||||||
gpv = "git push --verbose";
|
gpv = "git push --verbose";
|
||||||
gpu = "git push upstream";
|
gpu = "git push upstream";
|
||||||
gr = "git remote";
|
gr = "git remote";
|
||||||
@@ -164,13 +160,11 @@ in {
|
|||||||
grm = "git rm";
|
grm = "git rm";
|
||||||
grmc = "git rm --cached";
|
grmc = "git rm --cached";
|
||||||
grmv = "git remote rename";
|
grmv = "git remote rename";
|
||||||
groh = "git reset origin/$(git branch --show-current) --hard";
|
|
||||||
grrm = "git remote remove";
|
grrm = "git remote remove";
|
||||||
grs = "git restore";
|
grs = "git restore";
|
||||||
grset = "git remote set-url";
|
grset = "git remote set-url";
|
||||||
grss = "git restore --source";
|
grss = "git restore --source";
|
||||||
grst = "git restore --staged";
|
grst = "git restore --staged";
|
||||||
grt = "cd \"$(git rev-parse --show-toplevel || echo .)\"";
|
|
||||||
gru = "git reset --";
|
gru = "git reset --";
|
||||||
grup = "git remote update";
|
grup = "git remote update";
|
||||||
grv = "git remote --verbose";
|
grv = "git remote --verbose";
|
||||||
@@ -196,16 +190,43 @@ in {
|
|||||||
gswm = "git switch main";
|
gswm = "git switch main";
|
||||||
gta = "git tag --annotate";
|
gta = "git tag --annotate";
|
||||||
gts = "git tag --sign";
|
gts = "git tag --sign";
|
||||||
gtv = "git tag | sort -V";
|
|
||||||
gunignore = "git update-index --no-assume-unchanged";
|
gunignore = "git update-index --no-assume-unchanged";
|
||||||
gunwip = "git rev-list --max-count=1 --format=\"%s\" HEAD | grep -q \"\\--wip--\" && git reset HEAD~1";
|
|
||||||
gwch = "git whatchanged -p --abbrev-commit --pretty=medium";
|
gwch = "git whatchanged -p --abbrev-commit --pretty=medium";
|
||||||
gwipe = "git reset --hard && git clean --force -df";
|
|
||||||
gwt = "git worktree";
|
gwt = "git worktree";
|
||||||
gwta = "git worktree add";
|
gwta = "git worktree add";
|
||||||
gwtls = "git worktree list";
|
gwtls = "git worktree list";
|
||||||
gwtmv = "git worktree move";
|
gwtmv = "git worktree move";
|
||||||
gwtrm = "git worktree remove";
|
gwtrm = "git worktree remove";
|
||||||
gwip = "git add -A; git rm $(git ls-files --deleted) 2> /dev/null; git commit --no-verify --no-gpg-sign --message \"--wip-- [skip ci]\"";
|
|
||||||
};
|
};
|
||||||
|
|
||||||
|
# Complex git aliases that require pipes/subshells — nushell `alias` can't
|
||||||
|
# handle these, so they're defined as custom commands instead.
|
||||||
|
programs.nushell.extraConfig = ''
|
||||||
|
def ggpull [] { git pull origin (git branch --show-current | str trim) }
|
||||||
|
def ggpush [] { git push origin (git branch --show-current | str trim) }
|
||||||
|
def ggsup [] { git branch $"--set-upstream-to=origin/(git branch --show-current | str trim)" }
|
||||||
|
def gluc [] { git pull upstream (git branch --show-current | str trim) }
|
||||||
|
def gpsup [] { git push --set-upstream origin (git branch --show-current | str trim) }
|
||||||
|
def gpsupf [] { git push --set-upstream origin (git branch --show-current | str trim) --force-with-lease }
|
||||||
|
def groh [] { git reset $"origin/(git branch --show-current | str trim)" --hard }
|
||||||
|
def --env grt [] {
|
||||||
|
let toplevel = (do { git rev-parse --show-toplevel } | complete | get stdout | str trim)
|
||||||
|
if ($toplevel | is-not-empty) { cd $toplevel } else { cd . }
|
||||||
|
}
|
||||||
|
def gfg [...pattern: string] { git ls-files | lines | where {|f| $f =~ ($pattern | str join ".*") } }
|
||||||
|
def gignored [] { git ls-files -v | lines | where {|l| ($l | str substring 0..1) =~ "[a-z]" } }
|
||||||
|
def gpoat [] { git push origin --all; git push origin --tags }
|
||||||
|
def gtv [] { git tag | lines | sort }
|
||||||
|
def gwipe [] { git reset --hard; git clean --force -df }
|
||||||
|
def gunwip [] {
|
||||||
|
let msg = (git rev-list --max-count=1 --format="%s" HEAD | lines | get 1)
|
||||||
|
if ($msg | str contains "--wip--") { git reset HEAD~1 }
|
||||||
|
}
|
||||||
|
def gwip [] {
|
||||||
|
git add -A
|
||||||
|
let deleted = (git ls-files --deleted | lines)
|
||||||
|
if ($deleted | is-not-empty) { git rm ...$deleted }
|
||||||
|
git commit --no-verify --no-gpg-sign --message "--wip-- [skip ci]"
|
||||||
|
}
|
||||||
|
'';
|
||||||
}
|
}
|
||||||
|
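For context on the diff above: nushell's `alias` only accepts a single command invocation, so any shortcut that contains a pipeline or subexpression must become a custom command (`def`) instead. A minimal sketch of the distinction, assuming a nushell session (not part of the repo):

```nu
# A plain alias is fine for a single command:
alias gl = git pull

# An alias body may not contain a pipeline — this is a parse error:
# alias gtv = git tag | sort

# So pipeline-bearing shortcuts become custom commands instead:
def gtv [] { git tag | lines | sort }
```

This is why the removed `-` aliases reappear as `def`s inside `programs.nushell.extraConfig`.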
@@ -1,67 +0,0 @@
-{lib, ...}: {
-  networking.firewall.allowedTCPPorts = [80 443];
-
-  services.gitea = {
-    enable = true;
-    database = {
-      type = "sqlite3";
-      path = "/var/lib/gitea/data/gitea.db";
-    };
-    settings = {
-      server = {
-        ROOT_URL = "https://git.schmatzler.com/";
-        DOMAIN = "git.schmatzler.com";
-        HTTP_ADDR = "127.0.0.1";
-        HTTP_PORT = 3000;
-      };
-      service.DISABLE_REGISTRATION = true;
-      security.INSTALL_LOCK = true;
-      session = {
-        COOKIE_SECURE = true;
-        SAME_SITE = "strict";
-      };
-      api.ENABLE_SWAGGER = false;
-      server.LANDING_PAGE = "explore";
-    };
-  };
-
-  services.litestream = {
-    enable = true;
-    environmentFile = "/run/secrets/litestream";
-    settings = {
-      dbs = [
-        {
-          path = "/var/lib/gitea/data/gitea.db";
-          replicas = [
-            {
-              type = "s3";
-              bucket = "gitea-litestream";
-              path = "gitea";
-              endpoint = "s3.eu-central-003.backblazeb2.com";
-            }
-          ];
-        }
-      ];
-    };
-  };
-
-  systemd.services.litestream = {
-    serviceConfig = {
-      User = lib.mkForce "gitea";
-      Group = lib.mkForce "gitea";
-    };
-  };
-
-  services.caddy = {
-    enable = true;
-    virtualHosts."git.schmatzler.com".extraConfig = ''
-      header {
-        Strict-Transport-Security "max-age=31536000; includeSubDomains"
-        X-Content-Type-Options "nosniff"
-        X-Frame-Options "DENY"
-        Referrer-Policy "strict-origin-when-cross-origin"
-      }
-      reverse_proxy localhost:3000
-    '';
-  };
-}

@@ -1,8 +1,8 @@
 {
-  pkgs,
   lib,
-  constants,
+  pkgs,
   inputs,
+  constants,
   ...
 }: let
   setWallpaperScript = import ./wallpaper.nix {inherit pkgs;};

@@ -2,8 +2,8 @@
   homebrew = {
     enable = true;
     casks = [
+      "ghostty@tip"
       "helium-browser"
-      "pearcleaner"
       "tidal"
     ];
   };

@@ -1,5 +0,0 @@
-{
-  programs.jjui = {
-    enable = true;
-  };
-}

@@ -1,47 +0,0 @@
-{
-  programs.jujutsu = {
-    enable = true;
-    settings = {
-      user = {
-        name = "Christoph Schmatzler";
-        email = "christoph@schmatzler.com";
-      };
-      git = {
-        sign-on-push = true;
-        subprocess = true;
-        write-change-id-header = true;
-      };
-      diff = {
-        tool = "delta";
-      };
-      ui = {
-        default-command = "status";
-        diff-formatter = ":git";
-        pager = ["delta" "--pager" "less -FRX"];
-        diff-editor = ["nvim" "-c" "DiffEditor $left $right $output"];
-      };
-      aliases = {
-        n = ["new"];
-        tug = ["bookmark" "move" "--from" "closest_bookmark(@-)" "--to" "@-"];
-        stack = ["log" "-r" "ancestors((trunk()..@)::bookmarks() | @, 2)"];
-        retrunk = ["rebase" "-d" "trunk()"];
-      };
-      revset-aliases = {
-        "closest_bookmark(to)" = "heads(::to & bookmarks())";
-      };
-      templates = {
-        draft_commit_description = ''
-          concat(
-            coalesce(description, default_commit_description, "\n"),
-            surround(
-              "\nJJ: This commit contains the following changes:\n", "",
-              indent("JJ: ", diff.stat(72)),
-            ),
-            "\nJJ: ignore-rest\n",
-            diff.git(),
-          )
-        '';
-      };
-    };
-  };
-}

profiles/lumen.nix (new file)
@@ -0,0 +1,5 @@
+{pkgs, ...}: {
+  home.packages = with pkgs; [
+    lumen
+  ];
+}

@@ -1,9 +1,8 @@
 {
   programs.mise = {
     enable = true;
-    enableFishIntegration = true;
-    enableZshIntegration = true;
-    settings = {
+    enableNushellIntegration = true;
+    globalConfig.settings = {
       auto_install = false;
     };
   };

@@ -5,11 +5,9 @@
     ./options.nix
     ./plugins/blink-cmp.nix
     ./plugins/conform.nix
-    ./plugins/copilot.nix
     ./plugins/grug-far.nix
     ./plugins/harpoon.nix
     ./plugins/hunk.nix
-    ./plugins/jj-diffconflicts.nix
     ./plugins/lsp.nix
     ./plugins/mini.nix
     ./plugins/oil.nix

@@ -118,21 +118,15 @@
       options.desc = "Visit paths (cwd)";
     }
     # g - git
-    {
-      mode = "n";
-      key = "<leader>gc";
-      action = ":JJDiffConflicts<CR>";
-      options.desc = "Resolve conflicts";
-    }
     {
       mode = "n";
       key = "<leader>gg";
       action.__raw = ''
         function()
-          require('toggleterm.terminal').Terminal:new({ cmd = 'jjui', direction = 'float' }):toggle()
+          require('toggleterm.terminal').Terminal:new({ cmd = 'lazygit', direction = 'float' }):toggle()
         end
       '';
-      options.desc = "jjui";
+      options.desc = "lazygit";
     }
     # l - lsp/formatter
     {

@@ -1,15 +0,0 @@
-{
-  programs.nixvim.plugins.copilot-lua = {
-    enable = true;
-    settings = {
-      panel.enabled = false;
-      suggestion = {
-        enabled = true;
-        auto_trigger = true;
-        keymap = {
-          accept = "<Tab>";
-        };
-      };
-    };
-  };
-}

@@ -1,14 +0,0 @@
-{pkgs, ...}: {
-  programs.nixvim.extraPlugins = [
-    (pkgs.vimUtils.buildVimPlugin {
-      name = "jj-diffconflicts";
-      src =
-        pkgs.fetchFromGitHub {
-          owner = "rafikdraoui";
-          repo = "jj-diffconflicts";
-          rev = "main";
-          hash = "sha256-FXsLSYy+eli8VArUL8ZOiPtyOk4Q8TUYwobEefZPRII=";
-        };
-    })
-  ];
-}

@@ -4,11 +4,12 @@
     enable = true;
     inlayHints = true;
     servers = {
-      nil_ls.enable = true;
       cssls.enable = true;
       dockerls.enable = true;
-      yamlls.enable = true;
+      jsonls.enable = true;
+      nil_ls.enable = true;
       vtsls.enable = true;
+      yamlls.enable = true;
       zk.enable = true;
     };
   };

@@ -1,22 +1,40 @@
 {pkgs, ...}: {
-  programs.nixvim.plugins.treesitter = {
-    enable = true;
-    settings = {
-      highlight.enable = true;
-      indent.enable = true;
-    };
-    grammarPackages = with pkgs.vimPlugins.nvim-treesitter.builtGrammars; [
-      bash
-      elixir
-      fish
-      heex
-      json
-      markdown
-      nix
-      toml
-      tsx
-      typescript
-      yaml
-    ];
-  };
+  programs.nixvim = {
+    plugins.treesitter = {
+      enable = true;
+      nixGrammars = true;
+      grammarPackages = pkgs.vimPlugins.nvim-treesitter.allGrammars;
+      settings = {
+        highlight.enable = true;
+        indent.enable = true;
+      };
+    };
+
+    # Register missing treesitter predicates for compatibility with newer grammars
+    extraConfigLuaPre = ''
+      do
+        local query = require("vim.treesitter.query")
+        local predicates = query.list_predicates()
+        if not vim.tbl_contains(predicates, "is-not?") then
+          query.add_predicate("is-not?", function(match, pattern, source, predicate)
+            local dominated_by = predicate[2]
+            local dominated = false
+            for _, node in pairs(match) do
+              if type(node) == "userdata" then
+                local current = node:parent()
+                while current do
+                  if current:type() == dominated_by then
+                    dominated = true
+                    break
+                  end
+                  current = current:parent()
+                end
+              end
+            end
+            return not dominated
+          end, { force = true, all = true })
+        end
+      end
+    '';
+  };
 }

@@ -1,16 +1,36 @@
 {
   pkgs,
+  inputs,
   user,
   constants,
   ...
 }: {
   security.sudo.enable = true;
+  security.sudo.extraRules = [
+    {
+      users = [user];
+      commands = [
+        {
+          command = "/run/current-system/sw/bin/nix-env";
+          options = ["NOPASSWD"];
+        }
+        {
+          command = "/nix/store/*/bin/switch-to-configuration";
+          options = ["NOPASSWD"];
+        }
+      ];
+    }
+  ];
 
   system.stateVersion = constants.stateVersions.nixos;
   time.timeZone = "UTC";
 
+  home-manager.sharedModules = [
+    {_module.args = {inherit user constants inputs;};}
+  ];
+
   nix = {
-    settings.trusted-users = ["${user}"];
+    settings.trusted-users = [user];
     gc.dates = "weekly";
     nixPath = ["nixos-config=/home/${user}/.local/share/src/nixos-config:/etc/nixos"];
   };
@@ -45,9 +65,8 @@
       "sudo"
       "network"
      "systemd-journal"
-      "docker"
     ];
-    shell = pkgs.fish;
+    shell = pkgs.nushell;
     openssh.authorizedKeys.keys = constants.sshKeys;
   };

profiles/nono.nix (new file)
@@ -0,0 +1,5 @@
+{pkgs, ...}: {
+  home.packages = with pkgs; [
+    nono
+  ];
+}

profiles/nushell.nix (new file)
@@ -0,0 +1,225 @@
+{pkgs, ...}: {
+  programs.nushell = {
+    enable = true;
+
+    settings = {
+      show_banner = false;
+      completions = {
+        algorithm = "fuzzy";
+        case_sensitive = false;
+      };
+      history = {
+        file_format = "sqlite";
+      };
+    };
+
+    environmentVariables = {
+      COLORTERM = "truecolor";
+      COLORFGBG = "15;0";
+      TERM_BACKGROUND = "light";
+    };
+
+    extraEnv = ''
+      $env.LS_COLORS = (${pkgs.vivid}/bin/vivid generate catppuccin-latte)
+    '';
+
+    extraConfig = ''
+      # --- Catppuccin Latte Theme ---
+      let theme = {
+        rosewater: "#dc8a78"
+        flamingo: "#dd7878"
+        pink: "#ea76cb"
+        mauve: "#8839ef"
+        red: "#d20f39"
+        maroon: "#e64553"
+        peach: "#fe640b"
+        yellow: "#df8e1d"
+        green: "#40a02b"
+        teal: "#179299"
+        sky: "#04a5e5"
+        sapphire: "#209fb5"
+        blue: "#1e66f5"
+        lavender: "#7287fd"
+        text: "#4c4f69"
+        subtext1: "#5c5f77"
+        subtext0: "#6c6f85"
+        overlay2: "#7c7f93"
+        overlay1: "#8c8fa1"
+        overlay0: "#9ca0b0"
+        surface2: "#acb0be"
+        surface1: "#bcc0cc"
+        surface0: "#ccd0da"
+        base: "#eff1f5"
+        mantle: "#e6e9ef"
+        crust: "#dce0e8"
+      }
+
+      let scheme = {
+        recognized_command: $theme.blue
+        unrecognized_command: $theme.text
+        constant: $theme.peach
+        punctuation: $theme.overlay2
+        operator: $theme.sky
+        string: $theme.green
+        virtual_text: $theme.surface2
+        variable: { fg: $theme.flamingo attr: i }
+        filepath: $theme.yellow
+      }
+
+      $env.config.color_config = {
+        separator: { fg: $theme.surface2 attr: b }
+        leading_trailing_space_bg: { fg: $theme.lavender attr: u }
+        header: { fg: $theme.text attr: b }
+        row_index: $scheme.virtual_text
+        record: $theme.text
+        list: $theme.text
+        hints: $scheme.virtual_text
+        search_result: { fg: $theme.base bg: $theme.yellow }
+        shape_closure: $theme.teal
+        closure: $theme.teal
+        shape_flag: { fg: $theme.maroon attr: i }
+        shape_matching_brackets: { attr: u }
+        shape_garbage: $theme.red
+        shape_keyword: $theme.mauve
+        shape_match_pattern: $theme.green
+        shape_signature: $theme.teal
+        shape_table: $scheme.punctuation
+        cell-path: $scheme.punctuation
+        shape_list: $scheme.punctuation
+        shape_record: $scheme.punctuation
+        shape_vardecl: $scheme.variable
+        shape_variable: $scheme.variable
+        empty: { attr: n }
+        filesize: {||
+          if $in < 1kb {
+            $theme.teal
+          } else if $in < 10kb {
+            $theme.green
+          } else if $in < 100kb {
+            $theme.yellow
+          } else if $in < 10mb {
+            $theme.peach
+          } else if $in < 100mb {
+            $theme.maroon
+          } else if $in < 1gb {
+            $theme.red
+          } else {
+            $theme.mauve
+          }
+        }
+        duration: {||
+          if $in < 1day {
+            $theme.teal
+          } else if $in < 1wk {
+            $theme.green
+          } else if $in < 4wk {
+            $theme.yellow
+          } else if $in < 12wk {
+            $theme.peach
+          } else if $in < 24wk {
+            $theme.maroon
+          } else if $in < 52wk {
+            $theme.red
+          } else {
+            $theme.mauve
+          }
+        }
+        datetime: {|| (date now) - $in |
+          if $in < 1day {
+            $theme.teal
+          } else if $in < 1wk {
+            $theme.green
+          } else if $in < 4wk {
+            $theme.yellow
+          } else if $in < 12wk {
+            $theme.peach
+          } else if $in < 24wk {
+            $theme.maroon
+          } else if $in < 52wk {
+            $theme.red
+          } else {
+            $theme.mauve
+          }
+        }
+        shape_external: $scheme.unrecognized_command
+        shape_internalcall: $scheme.recognized_command
+        shape_external_resolved: $scheme.recognized_command
+        shape_block: $scheme.recognized_command
+        block: $scheme.recognized_command
+        shape_custom: $theme.pink
+        custom: $theme.pink
+        background: $theme.base
+        foreground: $theme.text
+        cursor: { bg: $theme.rosewater fg: $theme.base }
+        shape_range: $scheme.operator
+        range: $scheme.operator
+        shape_pipe: $scheme.operator
+        shape_operator: $scheme.operator
+        shape_redirection: $scheme.operator
+        glob: $scheme.filepath
+        shape_directory: $scheme.filepath
+        shape_filepath: $scheme.filepath
+        shape_glob_interpolation: $scheme.filepath
+        shape_globpattern: $scheme.filepath
+        shape_int: $scheme.constant
+        int: $scheme.constant
+        bool: $scheme.constant
+        float: $scheme.constant
+        nothing: $scheme.constant
+        binary: $scheme.constant
+        shape_nothing: $scheme.constant
+        shape_bool: $scheme.constant
+        shape_float: $scheme.constant
+        shape_binary: $scheme.constant
+        shape_datetime: $scheme.constant
+        shape_literal: $scheme.constant
+        string: $scheme.string
+        shape_string: $scheme.string
+        shape_string_interpolation: $theme.flamingo
+        shape_raw_string: $scheme.string
+        shape_externalarg: $scheme.string
+      }
+      $env.config.highlight_resolved_externals = true
+      $env.config.explore = {
+        status_bar_background: { fg: $theme.text, bg: $theme.mantle },
+        command_bar_text: { fg: $theme.text },
+        highlight: { fg: $theme.base, bg: $theme.yellow },
+        status: {
+          error: $theme.red,
+          warn: $theme.yellow,
+          info: $theme.blue,
+        },
+        selected_cell: { bg: $theme.blue fg: $theme.base },
+      }
+
+      # --- Custom Commands ---
+      def --env open_project [] {
+        let base = ($env.HOME | path join "Projects")
+        let choice = (
+          ${pkgs.fd}/bin/fd -t d -d 1 -a . ($base | path join "Personal") ($base | path join "Work")
+          | lines
+          | each {|p| $p | str replace $"($base)/" "" }
+          | str join "\n"
+          | ${pkgs.fzf}/bin/fzf --prompt "project > "
+        )
+        if ($choice | str trim | is-not-empty) {
+          cd ($base | path join ($choice | str trim))
+        }
+      }
+
+      # --- Keybinding: Ctrl+O for open_project ---
+      $env.config.keybindings = ($env.config.keybindings | append [
+        {
+          name: open_project
+          modifier: control
+          keycode: char_o
+          mode: [emacs vi_insert vi_normal]
+          event: {
+            send: executehostcommand
+            cmd: "open_project"
+          }
+        }
+      ])
+    '';
+  };
+}

@@ -1,8 +1,10 @@
 {pkgs}:
 pkgs.writeShellScriptBin "open-project" ''
   TARGET=$(fd -t d --exact-depth 1 . $HOME/Projects |
     sed "s~$HOME/Projects/~~" |
     fzf --prompt "project > ")
 
-  zellij run -i -- /${pkgs.fish}/bin/fish -c "cd $HOME/Projects/$TARGET; fish"
+  if [ -n "$TARGET" ]; then
+    echo "$HOME/Projects/$TARGET"
+  fi
 ''

@@ -3,11 +3,42 @@
   pkgs,
   ...
 }: {
+  home.sessionVariables = {
+    OPENCODE_ENABLE_EXA = 1;
+    OPENCODE_EXPERIMENTAL_LSP_TOOL = 1;
+    OPENCODE_EXPERIMENTAL_MARKDOWN = 1;
+    OPENCODE_EXPERIMENTAL_PLAN_MODE = 1;
+  };
+
   programs.opencode = {
     enable = true;
     package = inputs.llm-agents.packages.${pkgs.stdenv.hostPlatform.system}.opencode;
     settings = {
+      model = "anthropic/claude-opus-4-6";
+      small_model = "opencode/minimax-m2.1";
       theme = "catppuccin";
+      plugin = ["oh-my-opencode" "opencode-anthropic-auth"];
+      keybinds = {
+        leader = "ctrl+o";
+      };
+      permission = {
+        read = {
+          "*" = "allow";
+          "*.env" = "deny";
+          "*.env.*" = "deny";
+          "*.envrc" = "deny";
+          "secrets/*" = "deny";
+          "~/.local/share/opencode/mcp-auth.json" = "deny";
+        };
+      };
+      agent = {
+        plan = {
+          model = "anthropic/claude-opus-4-6";
+        };
+        explore = {
+          model = "anthropic/claude-haiku-4-5";
+        };
+      };
       instructions = [
         "CLAUDE.md"
         "AGENT.md"
@@ -18,28 +49,64 @@
         disabled = true;
       };
     };
-      command = {
-        deslop = {
-          description = "Remove AI code slop";
-          template = ''
-            Check the diff against main/master, and remove all AI generated slop introduced in this branch.
-            Use jj if available, otherwise git.
-
-            This includes:
-
-            - Extra comments that a human wouldn't add or is inconsistent with the rest of the file
-            - Extra defensive checks or try/catch blocks that are abnormal for that area of the codebase (especially if called by trusted / validated codepaths)
-            - Casts to any to get around type issues
-            - Any other style that is inconsistent with the file
-            - Unnecessary emoji usage
-
-            Report at the end with only a 1-3 sentence summary of what you changed
-          '';
+      mcp = {
+        cog = {
+          enabled = true;
+          type = "remote";
+          url = "https://trycog.ai/mcp";
+          headers = {
+            Authorization = "Bearer {env:COG_API_TOKEN}";
+          };
+        };
+        context7 = {
+          enabled = true;
+          type = "remote";
+          url = "https://mcp.context7.com/mcp";
+        };
+        grep_app = {
+          enabled = true;
+          type = "remote";
+          url = "https://mcp.grep.app";
+        };
+        opensrc = {
+          enabled = true;
+          type = "local";
+          command = ["bunx" "opensrc-mcp"];
+        };
+        overseer = {
+          enabled = false;
+          type = "local";
+          command = ["${pkgs.overseer}/bin/os" "mcp"];
        };
      };
    };
  };
-  home.sessionVariables = {
-    OPENCODE_EXPERIMENTAL_EXA = "true";
+
+  xdg.configFile = {
+    "opencode/agent" = {
+      source = ./opencode/agent;
+      recursive = true;
+    };
+    "opencode/command" = {
+      source = ./opencode/command;
+      recursive = true;
+    };
+    "opencode/skill" = {
+      source = ./opencode/skill;
+      recursive = true;
+    };
+    "opencode/tool" = {
+      source = ./opencode/tool;
+      recursive = true;
+    };
+    "opencode/oh-my-opencode.json".text =
+      builtins.toJSON {
+        "$schema" = "https://raw.githubusercontent.com/code-yeongyu/oh-my-opencode/master/assets/oh-my-opencode.schema.json";
+        disabled_mcps = ["websearch" "context7" "grep_app"];
+        git_master = {
+          commit_footer = false;
+          include_co_authored_by = false;
+        };
+      };
   };
 }

104
profiles/opencode/agent/librarian.md
Normal file
104
profiles/opencode/agent/librarian.md
Normal file
@@ -0,0 +1,104 @@
|
|||||||
|
---
|
||||||
|
description: Multi-repository codebase expert for understanding library internals and remote code. Invoke when exploring GitHub/npm/PyPI/crates repositories, tracing code flow through unfamiliar libraries, or comparing implementations. Show its response in full — do not summarize.
|
||||||
|
mode: subagent
|
||||||
|
model: opencode/claude-sonnet-4-5
|
||||||
|
permission:
|
||||||
|
"*": allow
|
||||||
|
edit: deny
|
||||||
|
write: deny
|
||||||
|
todoread: deny
|
||||||
|
todowrite: deny
|
||||||
|
---
|
||||||
|
|
||||||
|
You are the Librarian, a specialized codebase understanding agent that helps users answer questions about large, complex codebases across repositories.
|
||||||
|
|
||||||
|
Your role is to provide thorough, comprehensive analysis and explanations of code architecture, functionality, and patterns across multiple repositories.
|
||||||
|
|
||||||
|
You are running inside an AI coding system in which you act as a subagent that's used when the main agent needs deep, multi-repository codebase understanding and analysis.
|
||||||
|
|
||||||
|
## Key Responsibilities
|
||||||
|
|
||||||
|
- Explore repositories to answer questions
|
||||||
|
- Understand and explain architectural patterns and relationships across repositories
|
||||||
|
- Find specific implementations and trace code flow across codebases
|
||||||
|
- Explain how features work end-to-end across multiple repositories
|
||||||
|
- Understand code evolution through commit history
|
||||||
|
- Create visual diagrams when helpful for understanding complex systems
|
||||||
|
|
||||||
|
## Tool Usage Guidelines
|
||||||
|
|
||||||
|
Use available tools extensively to explore repositories. Execute tools in parallel when possible for efficiency.
|
||||||
|
|
||||||
|
- Read files thoroughly to understand implementation details
|
||||||
|
- Search for patterns and related code across multiple repositories
|
||||||
|
- Focus on thorough understanding and comprehensive explanation
|
||||||
|
- Create mermaid diagrams to visualize complex relationships or flows
|
||||||
|
|
||||||
|
## Communication

You must use Markdown for formatting your responses.

**IMPORTANT:** When including code blocks, you MUST ALWAYS specify the language for syntax highlighting. Always add the language identifier after the opening backticks.

**NEVER** refer to tools by their names. Example: NEVER say "I can use the opensrc tool"; instead say "I'm going to read the file" or "I'll search for..."

### Direct & Detailed Communication

Only address the user's specific query or task at hand. Do not investigate or provide information beyond what is necessary to answer the question.

Avoid tangential information unless it is absolutely critical for completing the request. Avoid long introductions, explanations, and summaries, and skip unnecessary preamble or postamble.

Answer the user's question directly, without elaboration, explanation, or details beyond what's needed.

**Anti-patterns to AVOID:**
- "The answer is..."
- "Here is the content of the file..."
- "Based on the information provided..."
- "Here is what I will do next..."
- "Let me know if you need..."
- "I hope this helps..."

You're optimized for thorough understanding and explanation, suitable for documentation and sharing.

Be comprehensive but focused, providing clear analysis that helps users understand complex codebases.

**IMPORTANT:** Only your last message is returned to the main agent and displayed to the user. Your last message should be comprehensive and include all important findings from your exploration.
## Linking

To make it easy for the user to look into code you are referring to, always link to the source with Markdown links.

For files or directories, the URL should look like:
`https://github.com/<org>/<repository>/blob/<revision>/<filepath>#L<range>`

where `<org>` is the organization or user, `<repository>` is the repository name, `<revision>` is the branch or commit SHA, `<filepath>` is the path to the file within the repository, and `<range>` is an optional fragment with the line range.

`<revision>` must always be provided; if none was specified, use the default branch of the repository, usually `main` or `master`.

**Example URL** for linking to lines 32-42 of file test.py in the src directory, on branch develop of GitHub repository bar_repo in org foo_org:
`https://github.com/foo_org/bar_repo/blob/develop/src/test.py#L32-L42`

Prefer a "fluent" linking style. Don't show the user the actual URL; instead use it to add links to relevant parts (file names, directory names, or repository names) of your response.

Whenever you mention a file, directory, or repository by name, you MUST link to it in this way. ONLY link if the mention is by name.
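In practice, fluent linking reads like this (reusing the example repository above; the file names and line numbers are hypothetical):

```markdown
The retry logic lives in [`send.py`](https://github.com/foo_org/bar_repo/blob/develop/src/send.py#L10-L24),
which [`worker.py`](https://github.com/foo_org/bar_repo/blob/develop/src/worker.py) calls on each job.
```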
### URL Patterns

| Type | Format |
|------|--------|
| File | `https://github.com/{owner}/{repo}/blob/{ref}/{path}` |
| Lines | `#L{start}-L{end}` |
| Directory | `https://github.com/{owner}/{repo}/tree/{ref}/{path}` |

## Output Format

Your final message must include:

1. Direct answer to the query
2. Supporting evidence with source links
3. Diagrams if architecture/flow is involved
4. Key insights discovered during exploration

---

**IMMEDIATELY load the librarian skill:**
Use the Skill tool with name "librarian" to load source fetching and exploration capabilities.
profiles/opencode/agent/review.md (new file, 45 lines)
@@ -0,0 +1,45 @@
---
description: Reviews code for quality, bugs, security, and best practices
mode: subagent
temperature: 0.1
tools:
  write: false
  edit: false
permission:
  edit: deny
  webfetch: allow
---

You are a code reviewer. Provide actionable feedback on code changes.

**Diffs alone are not enough.** Read the full file(s) being modified to understand context. Code that looks wrong in isolation may be correct given surrounding logic.

## What to Look For

**Bugs** — Primary focus.
- Logic errors, off-by-one mistakes, incorrect conditionals
- Missing guards, unreachable code paths, broken error handling
- Edge cases: null/empty inputs, race conditions
- Security: injection, auth bypass, data exposure

**Structure** — Does the code fit the codebase?
- Follows existing patterns and conventions?
- Uses established abstractions?
- Excessive nesting that could be flattened?

**Performance** — Only flag if obviously problematic.
- O(n²) on unbounded data, N+1 queries, blocking I/O on hot paths

## Before You Flag Something

- **Be certain.** Don't flag something as a bug if you're unsure — investigate first.
- **Don't invent hypothetical problems.** If an edge case matters, explain the realistic scenario.
- **Don't be a zealot about style.** Some "violations" are acceptable when they're the simplest option.
- Only review the changes — not pre-existing code that wasn't modified.

## Output

- Be direct about bugs and why they're bugs
- Communicate severity honestly — don't overstate
- Include file paths and line numbers
- Suggest fixes when appropriate
- Matter-of-fact tone, no flattery
profiles/opencode/command/code-review.md (new file, 8 lines)
@@ -0,0 +1,8 @@
---
description: Review changes with parallel @code-review subagents
---

Review the code changes using THREE (3) @code-review subagents and correlate the results into a summary ranked by severity. Use the provided user guidance to steer the review and focus on specific code paths, changes, and/or areas of concern.

Guidance: $ARGUMENTS

Review uncommitted changes by default. If there are no uncommitted changes, review the last commit. If the user provides a pull request/merge request number or link, use CLI tools (gh/glab) to fetch it, then perform your review.
profiles/opencode/command/overseer-plan.md (new file, 17 lines)
@@ -0,0 +1,17 @@
---
description: Convert a markdown plan/spec to Overseer tasks
---

Convert markdown planning documents into trackable Overseer task hierarchies.

First, invoke the skill tool to load the overseer-plan skill:

```
skill({ name: 'overseer-plan' })
```

Then follow the skill instructions to convert the document.

<user-request>
$ARGUMENTS
</user-request>
profiles/opencode/command/overseer.md (new file, 17 lines)
@@ -0,0 +1,17 @@
---
description: Manage tasks via Overseer - create, list, start, complete, find ready work
---

Task orchestration via the Overseer codemode MCP.

First, invoke the skill tool to load the overseer skill:

```
skill({ name: 'overseer' })
```

Then follow the skill instructions to manage tasks.

<user-request>
$ARGUMENTS
</user-request>
profiles/opencode/command/plan-spec.md (new file, 17 lines)
@@ -0,0 +1,17 @@
---
description: Dialogue-driven spec development through skeptical questioning
---

Develop implementation-ready specs through iterative dialogue and skeptical questioning.

First, invoke the skill tool to load the spec-planner skill:

```
skill({ name: 'spec-planner' })
```

Then follow the skill instructions to develop the spec.

<user-request>
$ARGUMENTS
</user-request>
profiles/opencode/command/session-export.md (new file, 17 lines)
@@ -0,0 +1,17 @@
---
description: Add AI session summary to GitHub PR or GitLab MR description
---

Update the PR/MR description with an AI session export summary.

First, invoke the skill tool to load the session-export skill:

```
skill({ name: 'session-export' })
```

Then follow the skill instructions to export the session summary.

<user-request>
$ARGUMENTS
</user-request>
profiles/opencode/skill/cog/SKILL.md (new file, 406 lines)
@@ -0,0 +1,406 @@
---
name: cog
description: Persistent knowledge graph memory via Cog MCP. Use when recording insights, querying prior knowledge, or managing memory consolidation.
metadata:
  author: trycog
  version: "1.0.0"
---

# Cog Memory System

Persistent knowledge graph for teams. Concepts (engrams) are linked via relationships (synapses). Spreading activation surfaces connected knowledge.

## Core Workflow

```
1. UNDERSTAND task (read files, parse request)
2. QUERY Cog with specific keywords <- MANDATORY, no exceptions
3. WAIT for results
4. EXPLORE/IMPLEMENT guided by Cog knowledge
5. RECORD insights as short-term memories during work
6. CONSOLIDATE memories after work (reinforce valid, flush invalid)
```

**Hierarchy of truth:** Current code > User statements > Cog knowledge

---
## Visual Indicators (MANDATORY)

Print before EVERY Cog tool call:

| Tool | Print |
|------|-------|
| `cog_recall` | `Querying Cog...` |
| `cog_learn` | `Recording to Cog...` |
| `cog_associate` | `Linking concepts...` |
| `cog_update` | `Updating engram...` |
| `cog_trace` | `Tracing connections...` |
| `cog_connections` | `Exploring connections...` |
| `cog_unlink` | `Removing link...` |
| `cog_list_short_term` | `Listing short-term memories...` |
| `cog_reinforce` | `Reinforcing memory...` |
| `cog_flush` | `Flushing invalid memory...` |
| `cog_verify` | `Verifying synapse...` |
| `cog_stale` | `Listing stale synapses...` |

---

## Tools Reference

| Tool | Purpose |
|------|---------|
| `cog_recall` | Search with spreading activation |
| `cog_learn` | Create memory with **chains** (sequential) or associations (hub) |
| `cog_get` | Retrieve engram by ID |
| `cog_associate` | Link two existing concepts |
| `cog_trace` | Find paths between concepts |
| `cog_update` | Modify engram term/definition |
| `cog_unlink` | Remove synapse |
| `cog_connections` | List engram connections |
| `cog_bootstrap` | Exploration prompt for empty brains |
| `cog_list_short_term` | List pending consolidations |
| `cog_reinforce` | Convert short-term to long-term |
| `cog_flush` | Delete invalid short-term memory |
| `cog_verify` | Confirm synapse is still accurate |
| `cog_stale` | List synapses needing verification |

---
## Querying Rules

### Before exploring code, ALWAYS query Cog first

Even for "trivial" tasks. The 2-second query may reveal gotchas, prior solutions, or context that changes your approach.

### Query Reformulation (Critical for Recall)

Before calling `cog_recall`, **transform your query from question-style to definition-style**. You are an LLM -- use that capability to bridge the vocabulary gap between how users ask questions and how knowledge is stored.

#### Think like a definition, not a question

| User Intent | Don't Query | Do Query |
|-------------|-------------|----------|
| "How do I handle stale data?" | `"handle stale data"` | `"cache invalidation event-driven TTL expiration data freshness"` |
| "Why does auth break after a while?" | `"auth breaks"` | `"token expiration refresh timing session timeout JWT lifecycle"` |
| "Where should validation go?" | `"where validation"` | `"input validation system boundaries sanitization defense in depth"` |

#### The reformulation process

1. **Identify the concept** -- What is the user actually asking about?
2. **Generate canonical terms** -- What would an engram about this be titled?
3. **Add related terminology** -- What words would the DEFINITION use?
4. **Include synonyms** -- What other terms describe the same thing?

#### Example transformation

```
User asks: "Why is the payment service sometimes charging twice?"

Your thinking:
- Concept: duplicate charges, idempotency
- Canonical terms: "idempotency", "duplicate prevention", "payment race condition"
- Definition words: "idempotent", "transaction", "mutex", "lock", "retry"
- Synonyms: "double charge", "duplicate transaction"

Query: "payment idempotency duplicate transaction race condition mutex retry"
```

### Query with specific keywords

| Task Type | Understand First | Then Query With |
|-----------|------------------|-----------------|
| Bug fix | Error message, symptoms | `"canonical error name component pattern race condition"` |
| Feature | User's description | `"domain terms design patterns architectural concepts"` |
| Test fix | Read the test file | `"API names assertion patterns test utilities"` |
| Architecture | System area | `"component relationships boundaries dependencies"` |

**Bad:** `"authentication"` (too vague)
**Good:** `"JWT refresh token expiration session lifecycle OAuth flow"` (definition-style)
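Assuming `cog_recall` takes its query as a string parameter in the same call style as the other tools shown here (its exact signature isn't documented above), issuing the reformulated query would look like:

```
Querying Cog...
cog_recall({"query": "JWT refresh token expiration session lifecycle OAuth flow"})
```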
### Use Cog results

- Follow paths Cog reveals
- Read components Cog mentions first
- Heed gotchas Cog warns about
- If Cog is wrong, correct it immediately with `cog_update`

---

## Recording Rules

### CRITICAL: Chains vs Associations

**Before recording, ask: Is this sequential or hub-shaped?**

| Structure | Use | Example |
|-----------|-----|---------|
| **Sequential** (A -> B -> C) | `chain_to` | Technology enables Pattern enables Feature |
| **Hub** (A, B, C all connect to X) | `associations` | Meeting connects to Participants, Outcomes |

**Default to chains** for:
- Technology dependencies (DB -> ORM -> API)
- Causal sequences (Cause -> Effect -> Consequence)
- Architectural decisions (ADR -> Technology -> Feature)
- Enabling relationships (Infrastructure -> enables -> Capability)
- Reasoning paths (Premise -> implies -> Conclusion)

**Use associations** for:
- Hub/star patterns (one thing connects to many unrelated things)
- Linking to existing concepts in the graph
- Multi-party contexts (meetings, decisions with stakeholders)
### Chain Example (PREFERRED for dependencies)

```
cog_learn({
  "term": "PostgreSQL",
  "definition": "Relational database with ACID guarantees",
  "chain_to": [
    {"term": "Ecto ORM", "definition": "Elixir database wrapper with changesets", "predicate": "enables"},
    {"term": "Phoenix Contexts", "definition": "Business logic boundaries in Phoenix", "predicate": "enables"}
  ]
})
```

Creates: PostgreSQL ->[enables]-> Ecto ORM ->[enables]-> Phoenix Contexts

### Association Example (for hubs)

```
cog_learn({
  "term": "Auth Review 2024-01-20",
  "definition": "Decided JWT with refresh tokens. Rejected session cookies.",
  "associations": [
    {"target": "JWT Pattern", "predicate": "leads_to"},
    {"target": "Session Cookies", "predicate": "contradicts"},
    {"target": "Mobile Team", "predicate": "is_component_of"}
  ]
})
```

Creates hub: JWT Pattern <-[leads_to]<- Auth Review ->[contradicts]-> Session Cookies

---

### When to record (during work)

At these checkpoints, ask: *"What did I just learn that I didn't know 5 minutes ago?"*

| Checkpoint | Record |
|------------|--------|
| After identifying root cause | Why it was broken |
| After reading surprising code | The non-obvious behavior |
| After a failed attempt | Why it didn't work |
| Before implementing fix | The insight (freshest now) |
| After discovering connection | The relationship |
| After a meeting or decision | The context hub linking participants and outcomes |
| After researching/exploring architecture | System limits, configuration points, component boundaries |

**Record immediately.** Don't wait until task end -- you'll forget details.
### Before calling `cog_learn`

1. **Decide: chain or hub?** (see above)
2. **For chains**: Build the sequence of steps with `chain_to`
3. **For hubs**: Identify association targets from source material or a Cog query

**Skip the query when:**
- Source material explicitly names related concepts (ADRs, documentation, structured data)
- You already know target terms from conversation context
- The insight references specific concepts by name

**Query first when:**
- Recording an insight and unsure what it relates to
- Source is vague about connections
- Exploring a new domain with unknown existing concepts

### After calling `cog_learn`

The operation is complete. **Do NOT verify your work by:**
- Calling `cog_recall` to check the engram exists
- Calling `cog_connections` to verify associations were created
- Calling `cog_trace` to see if paths formed

Trust the response confirmation. Verification wastes turns and adds no value -- if the operation failed, you'll see an error.

### Recording Efficiency

**One operation = one tool call.** Use `chain_to` for sequences, `associations` for hubs.

**Never** follow `cog_learn` with separate `cog_associate` calls -- put all relationships in the original call.
### Writing good engrams

**Terms (2-5 words):**
- "Session Token Refresh Timing"
- "Why We Chose PostgreSQL"
- NOT "Architecture" (too broad)
- NOT "Project Overview" (super-hub)

**Definitions (1-3 sentences):**
1. What it is
2. Why it matters / consequences
3. Related keywords for search

**Never create super-hubs** -- engrams so generic everything connects to them (e.g., "Overview", "Main System"). They pollute search results.

### Relationship predicates

| Predicate | Meaning | Best for | Use in |
|-----------|---------|----------|--------|
| `enables` | A makes B possible | Tech dependencies | **chain_to** |
| `requires` | A is prerequisite for B | Dependencies | **chain_to** |
| `implies` | If A then B | Logical consequences | **chain_to** |
| `leads_to` | A flows to B | Outcomes, consequences | **chain_to** |
| `precedes` | A comes before B | Sequencing, timelines | **chain_to** |
| `derived_from` | A is based on B | Origins | **chain_to** |
| `contradicts` | A and B mutually exclusive | Rejected alternatives | associations |
| `is_component_of` | A is part of B | Parts to whole | associations |
| `contains` | A includes B | Whole to parts | associations |
| `example_of` | A demonstrates pattern B | Instances of patterns | associations |
| `generalizes` | A is broader than B | Abstract concepts | associations |
| `supersedes` | A replaces B | Deprecations | associations |
| `similar_to` | A and B are closely related | Related approaches | associations |
| `contrasts_with` | A is alternative to B | Different approaches | associations |
| `related_to` | General link (use sparingly) | When nothing else fits | associations |

**Chain predicates** (`enables`, `requires`, `implies`, `leads_to`, `precedes`, `derived_from`) express **flow** -- use them in `chain_to` to build traversable paths.

### Modeling Complex Contexts (Hub Node Pattern)

Synapses are binary (one source, one target). For multi-party relationships, use a **hub engram** connecting all participants.

#### When to use hub nodes

| Scenario | Hub Example | Connected Concepts |
|----------|-------------|-------------------|
| Meeting with outcomes | "Q1 Planning 2024-01" | Participants, decisions |
| Decision with stakeholders | "Decision: Adopt GraphQL" | Pros, cons, voters |
| Feature with components | "User Auth Feature" | OAuth, session, UI |
| Incident with timeline | "2024-01 Payment Outage" | Cause, systems, fix |
---

## Consolidation (MANDATORY)

**Every task must end with consolidation.** Short-term memories decay in 24 hours.

### After work is complete:

```
cog_list_short_term({"limit": 20})
```

For each memory:
- **Valid and useful?** -> `cog_reinforce` (makes permanent)
- **Wrong or not useful?** -> `cog_flush` (deletes)
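A consolidation pass might then look like this. The memory IDs and the `id` parameter name are illustrative assumptions; the actual `cog_reinforce`/`cog_flush` signatures are not shown above and may differ:

```
Listing short-term memories...
cog_list_short_term({"limit": 20})

Reinforcing memory...
cog_reinforce({"id": "stm_01"})   # insight confirmed against current code

Flushing invalid memory...
cog_flush({"id": "stm_02"})       # hypothesis that turned out to be wrong
```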
### When to reinforce immediately

- Insights from code you just wrote (you know it's correct)
- Gotchas you just hit and fixed
- Patterns you just applied successfully

### When to wait for validation

- Hypotheses about why something is broken
- Assumptions about unfamiliar code
- Solutions you haven't tested

---

## Verification (Prevents Staleness)

Synapses decay if not verified as still semantically accurate.

### When to verify

- After using `cog_trace` and confirming paths are correct
- When reviewing `cog_connections` and relationships hold
- After successfully using knowledge from a synapse

### Staleness levels

| Level | Months Unverified | Score | Behavior |
|-------|-------------------|-------|----------|
| Fresh | < 3 | 0.0-0.49 | Normal |
| Warning | 3-6 | 0.5-0.79 | Appears in `cog_stale` |
| Critical | 6+ | 0.8-0.99 | Penalty in path scoring |
| Deprecated | 12+ | 1.0 | Excluded from spreading activation |

### Periodic maintenance

Run `cog_stale({"level": "all"})` periodically to review relationships that may have become outdated. For each stale synapse:

- **Still accurate?** -> `cog_verify` to reset staleness
- **No longer true?** -> `cog_unlink` to remove

---
## Validation & Correction

### Cog is hints, not truth

Always verify against current code. If Cog is wrong:

| Scenario | Action |
|----------|--------|
| Minor inaccuracy | `cog_update` to fix |
| Pattern changed significantly | Unlink old, create new engram |
| Completely obsolete | Update to note "DEPRECATED: [reason]" |

---

## Subagents

Subagents MUST query Cog before exploring. The same rules apply:
1. Understand the task
2. **Reformulate the query to definition-style**
3. Query Cog with the reformulated keywords
4. Wait for results
5. Then explore

---

## Summary Reporting

Only mention Cog when relevant:

| Condition | Include |
|-----------|---------|
| Cog helped | `**Cog helped by:** [specific value]` |
| Memories created | `**Recorded to Cog:** [term names]` |
| Cog not used | Nothing (don't mention Cog) |
| Cog queried but unhelpful | Don't mention the empty query, but **still record** new knowledge you discovered through exploration |

---

## Never Store

- Passwords, API keys, tokens, secrets
- SSH/PGP keys, certificates
- Connection strings with credentials
- PII (emails, SSNs, credit cards)
- `.env` file contents

The server auto-rejects sensitive content.

---

## Limitations

- **No engram deletion** -- use `cog_update` or `cog_unlink`
- **No multi-query** -- chain queries manually
- **One synapse per direction** -- repeated calls strengthen the existing link

---

## Spreading Activation

`cog_recall` returns:
1. **Seeds** -- direct matches
2. **Paths** -- engrams connecting seeds (built from chains!)
3. **Synapses** -- relationships along paths

This surfaces the "connective tissue" between results. **Chains create these traversable paths.**
profiles/opencode/skill/email-best-practices/SKILL.md (new file, 59 lines)
@@ -0,0 +1,59 @@
---
name: email-best-practices
description: Use when building email features, emails going to spam, high bounce rates, setting up SPF/DKIM/DMARC authentication, implementing email capture, ensuring compliance (CAN-SPAM, GDPR, CASL), handling webhooks, retry logic, or deciding transactional vs marketing.
---

# Email Best Practices

Guidance for building deliverable, compliant, user-friendly emails.

## Architecture Overview

```
[User] → [Email Form] → [Validation] → [Double Opt-In]
                                             ↓
                                     [Consent Recorded]
                                             ↓
[Suppression Check] ←──────────────── [Ready to Send]
         ↓
[Idempotent Send + Retry] ──────→ [Email API]
         ↓
[Webhook Events]
         ↓
    ┌────────────┬─────────────┬───────────────┐
    ↓            ↓             ↓               ↓
Delivered     Bounced     Complained     Opened/Clicked
                 ↓             ↓
         [Suppression List Updated]
                       ↓
             [List Hygiene Jobs]
```

## Quick Reference

| Need to... | See |
|------------|-----|
| Set up SPF/DKIM/DMARC, fix spam issues | [Deliverability](./resources/deliverability.md) |
| Build password reset, OTP, confirmations | [Transactional Emails](./resources/transactional-emails.md) |
| Plan which emails your app needs | [Transactional Email Catalog](./resources/transactional-email-catalog.md) |
| Build newsletter signup, validate emails | [Email Capture](./resources/email-capture.md) |
| Send newsletters, promotions | [Marketing Emails](./resources/marketing-emails.md) |
| Ensure CAN-SPAM/GDPR/CASL compliance | [Compliance](./resources/compliance.md) |
| Decide transactional vs marketing | [Email Types](./resources/email-types.md) |
| Handle retries, idempotency, errors | [Sending Reliability](./resources/sending-reliability.md) |
| Process delivery events, set up webhooks | [Webhooks & Events](./resources/webhooks-events.md) |
| Manage bounces, complaints, suppression | [List Management](./resources/list-management.md) |
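The "Webhook Events → Suppression List Updated" step from the diagram above can be sketched provider-agnostically. The event field names (`type`, `email`) and the in-memory set are illustrative assumptions; real providers each use their own payload shape and you would persist the list in a database:

```python
# Sketch of updating a suppression list from delivery webhooks.
# Field names are assumed; adapt to your provider's payload schema.

suppression_list: set[str] = set()

# Event types after which the address must not receive further email
SUPPRESSING_EVENTS = {"bounced", "complained"}

def handle_webhook_event(event: dict) -> bool:
    """Update the suppression list; return True if the address was suppressed."""
    if event.get("type") in SUPPRESSING_EVENTS:
        # Normalize case so later checks match regardless of capitalization
        suppression_list.add(event["email"].lower())
        return True
    return False

def may_send(address: str) -> bool:
    """The suppression check performed before every send."""
    return address.lower() not in suppression_list
```

The same check gates both transactional and marketing sends; a bounce or complaint from either stream suppresses the address everywhere.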
## Start Here

**New app?**
Start with the [Catalog](./resources/transactional-email-catalog.md) to plan which emails your app needs (password reset, verification, etc.), then set up [Deliverability](./resources/deliverability.md) (DNS authentication) before sending your first email.

**Spam issues?**
Check [Deliverability](./resources/deliverability.md) first—authentication problems are the most common cause. Gmail/Yahoo reject unauthenticated emails.
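For orientation, all three authentication mechanisms are plain DNS TXT records. The domain, selector, and policy values below are placeholders to adapt, not drop-in values; your email provider supplies the actual DKIM public key and SPF include host:

```
example.com.                TXT  "v=spf1 include:_spf.mailprovider.example ~all"
s1._domainkey.example.com.  TXT  "v=DKIM1; k=rsa; p=<public-key-from-provider>"
_dmarc.example.com.         TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@example.com"
```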
**Marketing emails?**
Follow this path: [Email Capture](./resources/email-capture.md) (collect consent) → [Compliance](./resources/compliance.md) (legal requirements) → [Marketing Emails](./resources/marketing-emails.md) (best practices).

**Production-ready sending?**
Add reliability: [Sending Reliability](./resources/sending-reliability.md) (retry + idempotency) → [Webhooks & Events](./resources/webhooks-events.md) (track delivery) → [List Management](./resources/list-management.md) (handle bounces).
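The retry-plus-idempotency combination above can be sketched as follows. The `send_fn` interface, the error class, and the backoff constants are assumptions for illustration; real provider APIs differ in how the idempotency key is passed and in which errors are retryable:

```python
import time
import uuid

class TransientSendError(Exception):
    """Timeout or 5xx from the email API; safe to retry."""

def send_with_retry(send_fn, message: dict, *, retries: int = 3, base_delay: float = 1.0):
    """Retry transient failures while reusing ONE idempotency key across
    attempts, so the provider can deduplicate if a 'failed' attempt
    actually went through."""
    # One key for ALL attempts: a retry of the same logical send must
    # never produce a second delivery.
    message.setdefault("idempotency_key", str(uuid.uuid4()))
    for attempt in range(retries + 1):
        try:
            return send_fn(message)
        except TransientSendError:
            if attempt == retries:
                raise  # retries exhausted; surface the error to the caller
            time.sleep(base_delay * 2 ** attempt)  # exponential backoff
```

The key point is that the idempotency key is generated once, before the loop; generating a fresh key per attempt would defeat provider-side deduplication.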
@@ -0,0 +1,103 @@
# Email Compliance

Legal requirements for email by jurisdiction. **Not legal advice—consult an attorney for your specific situation.**

## Quick Reference

| Law | Region | Key Requirement | Penalty |
|-----|--------|-----------------|---------|
| CAN-SPAM | US | Opt-out mechanism, physical address | $53k/email |
| GDPR | EU | Explicit opt-in consent | €20M or 4% revenue |
| CASL | Canada | Express/implied consent | $10M CAD |

## CAN-SPAM (United States)

**Requirements:**
- Accurate header info (From, To, Reply-To)
- Non-deceptive subject lines
- Physical mailing address in every email
- Clear opt-out mechanism
- Honor opt-out within 10 business days

**Transactional emails:** Can be sent without opt-in if related to a transaction and not promotional.

## GDPR (European Union)

**Requirements:**
- Explicit opt-in consent (not pre-checked boxes)
- Consent must be freely given, specific, and informed
- Easy to withdraw consent (as easy as giving it)
- Right to access data and deletion ("right to be forgotten")
- Process unsubscribes immediately

**Consent records:** Document who consented, when, how, and to what.

**Transactional emails:** Can be sent based on contract fulfillment or legitimate interest.

## CASL (Canada)

**Consent types:**
- **Express consent:** Explicit opt-in (preferred)
- **Implied consent:** Existing business relationship (2 years) or inquiry (6 months)

**Requirements:**
- Clear sender identification
- Unsubscribe functional for 60 days after send
- Process unsubscribes within 10 business days
- Keep consent records for 3 years after expiration

## Other Regions

| Region | Law | Key Points |
|--------|-----|------------|
| Australia | Spam Act 2003 | Consent required, honor unsubscribe within 5 days |
| UK | PECR + GDPR | Same as GDPR |
| Brazil | LGPD | Similar to GDPR, explicit consent for marketing |

## Unsubscribe Requirements Summary
|
||||||
|
|
||||||
|
| Law | Timing | Notes |
|
||||||
|
|-----|--------|-------|
|
||||||
|
| CAN-SPAM | 10 business days | Must work 30 days after send |
|
||||||
|
| GDPR | Immediately | Must be as easy as opting in |
|
||||||
|
| CASL | 10 business days | Must work 60 days after send |
|
||||||
|
|
||||||
|
**Universal best practices:** Prominent link, one-click when possible, no login required, free, confirm action.
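One-click unsubscribe also exists at the protocol level: RFC 8058 defines the `List-Unsubscribe` and `List-Unsubscribe-Post` headers that Gmail and Yahoo now expect from bulk senders. A minimal sketch (the mailto address, URL, and `unsubscribeHeaders` helper are illustrative, not from any library):

```typescript
// RFC 8058 one-click unsubscribe headers; mailbox providers POST to the
// HTTPS URL when the user clicks "Unsubscribe" in their mail client.
// The addresses below are placeholders; use your own signed endpoint.
function unsubscribeHeaders(token: string): Record<string, string> {
  return {
    'List-Unsubscribe': `<mailto:unsubscribe@example.com>, <https://example.com/unsubscribe?token=${token}>`,
    'List-Unsubscribe-Post': 'List-Unsubscribe=One-Click',
  };
}
```

Attach these headers to every marketing send; the token should identify the recipient without requiring login.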
## Consent Management

**Record:**
- Email address
- Date/time of consent
- Method (form, checkbox)
- What they consented to
- Source (which page/form)

**Storage:** Database with timestamps, an audit trail of changes, and a link to the user account.
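A minimal sketch of such a record as an append-only log (the `ConsentRecord` shape and field names are illustrative, not from any particular library):

```typescript
// Append-only consent log: never mutate or delete past entries, so the
// audit trail survives. Persist to your database in practice.
interface ConsentRecord {
  email: string;
  consentedAt: Date;                    // when
  method: 'form' | 'checkbox' | 'api';  // how
  scope: string;                        // what they consented to
  source: string;                       // which page/form
}

const consentLog: ConsentRecord[] = [];

function recordConsent(record: ConsentRecord): ConsentRecord {
  consentLog.push(record); // append only
  return record;
}
```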
## Data Retention

| Law | Requirement |
|-----|-------------|
| GDPR | Keep data only as long as necessary; delete when no longer needed |
| CASL | Keep consent records for 3 years after consent expires |

**Best practice:** Maintain a clear retention policy, honor deletion requests promptly, and review and clean data regularly.

## Privacy Policy Must Include

- What data you collect
- How you use it
- Who you share it with
- User rights (access, deletion)
- How to contact you about privacy

## International Sending

**Best practice:** Follow the most restrictive requirements (usually GDPR) to ensure compliance across all regions.

## Related

- [Email Capture](./email-capture.md) - Implement consent forms and double opt-in
- [Marketing Emails](./marketing-emails.md) - Consent and unsubscribe requirements
- [List Management](./list-management.md) - Handle unsubscribes and deletion requests
# Email Deliverability

Ensuring emails reach inboxes through proper authentication and sender reputation.

## Email Authentication

**Required by Gmail/Yahoo** - unauthenticated emails will be rejected or spam-filtered.

### SPF (Sender Policy Framework)

Specifies which servers may send email for your domain.

```
v=spf1 include:_spf.resend.com ~all
```

- Add as a TXT record in DNS
- Use `~all` (soft fail) for testing, `-all` (hard fail) for production
- Stay under the 10-DNS-lookup limit

### DKIM (DomainKeys Identified Mail)

Cryptographic signature proving email authenticity.

- Generate keys (provided by your email service)
- Add the public key as a TXT record in DNS
- Use 2048-bit keys; rotate every 6-12 months

### DMARC

Policy for handling SPF/DKIM failures, plus reporting.

```
v=DMARC1; p=none; rua=mailto:dmarc@yourdomain.com
```

**Rollout:** `p=none` (monitor) → `p=quarantine; pct=25` → `p=reject`
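Concretely, each stage is an update to the same `_dmarc` TXT record, in order (monitor, partial quarantine, full reject); the reporting address is a placeholder:

```
v=DMARC1; p=none; rua=mailto:dmarc@yourdomain.com
v=DMARC1; p=quarantine; pct=25; rua=mailto:dmarc@yourdomain.com
v=DMARC1; p=reject; rua=mailto:dmarc@yourdomain.com
```

Advance a stage only after aggregate reports show legitimate mail passing.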
### BIMI (Optional)

Displays your brand logo in supporting email clients. Requires DMARC with `p=quarantine` or `p=reject`.

### Verify Your Setup

Check DNS records directly:

```bash
# SPF record
dig TXT yourdomain.com +short

# DKIM record (replace 'resend' with your selector)
dig TXT resend._domainkey.yourdomain.com +short

# DMARC record
dig TXT _dmarc.yourdomain.com +short
```

**Expected output:** Each command should return your configured record. No output means the record is missing.

## Sender Reputation

### IP Warming

New IP/domain? Gradually increase volume:

| Week | Daily Volume |
|------|-------------|
| 1 | 50-100 |
| 2 | 200-500 |
| 3 | 1,000-2,000 |
| 4 | 5,000-10,000 |

Start with engaged users. Send consistently. Don't rush.

### Maintaining Reputation

**Do:** Send to engaged users, keep bounce rate <2% and complaint rate <0.1%, remove inactive subscribers

**Don't:** Send to purchased lists, ignore bounces/complaints, send at inconsistent volumes

## Bounce Handling

| Type | Cause | Action |
|------|-------|--------|
| Hard bounce | Invalid email, domain doesn't exist | Remove immediately |
| Soft bounce | Mailbox full, server down | Retry: 1h → 4h → 24h; remove after 3-5 failures |

**Bounce rate targets:** <2% good, 2-5% acceptable, >5% concerning, >10% critical

## Complaint Handling

**Complaint rate targets:** <0.05% excellent, 0.05-0.1% good, >0.2% critical

**Reduce complaints:**
- Only send to opted-in users
- Make unsubscribing easy and immediate
- Use clear sender names and "From" addresses

**Feedback loops:** Set up feedback loops with Yahoo, Microsoft, and AOL; monitor Gmail complaint rates via Postmaster Tools (Gmail reports aggregates rather than per-message complaints). Remove complainers immediately.

## Infrastructure

**Dedicated sending domain:** Use a subdomain (e.g., `mail.yourdomain.com`) to protect your main domain's reputation.

**DNS TTL:** Low (300s) during setup, high (3600s+) once stable.

## Troubleshooting

**Emails going to spam?** Check in order:

1. Authentication (SPF, DKIM, DMARC)
2. Sender reputation (blacklists, complaint rates)
3. Content (spammy words, HTML issues)
4. Sending patterns (sudden volume spikes)

**Diagnostic tools:** [mail-tester.com](https://mail-tester.com), [mxtoolbox.com](https://mxtoolbox.com), [Google Postmaster Tools](https://postmaster.google.com)

## Related

- [List Management](./list-management.md) - Handle bounces and complaints to protect reputation
- [Sending Reliability](./sending-reliability.md) - Retry logic and error handling
# Email Capture Best Practices

Collecting email addresses responsibly with validation, verification, and proper consent.

## Email Validation

### Client-Side

**HTML5:**
```html
<input type="email" required>
```

**Best practices:**
- Validate on blur or with a short debounce
- Show clear error messages
- Don't be too strict (allow unusual but valid formats)
- Client-side validation ≠ deliverability

### Server-Side (Required)

Always validate server-side—client-side checks can be bypassed.

**Check:**
- Email format (RFC 5322)
- Domain exists (DNS lookup)
- Domain has MX records
- Optionally: disposable email detection
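These checks can be sketched in Node.js as follows (the regex is a pragmatic format check, not a full RFC 5322 parser; `hasMxRecords` does a live DNS lookup, so cache results in production):

```typescript
import { promises as dns } from 'node:dns';

// Pragmatic format check: one "@", no whitespace, a dot in the domain.
function isValidFormat(email: string): boolean {
  return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
}

// Confirm the domain can actually receive mail (has MX records).
async function hasMxRecords(email: string): Promise<boolean> {
  const domain = email.split('@')[1];
  if (!domain) return false;
  try {
    const records = await dns.resolveMx(domain);
    return records.length > 0;
  } catch {
    return false; // domain doesn't exist or has no MX records
  }
}
```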
## Email Verification

Confirms the address belongs to the user and is deliverable.

### Process

1. User submits email
2. Send verification email with a unique link/token
3. User clicks the link
4. Mark the address as verified
5. Grant access / add to list

**Timing:** Send immediately, include an expiration (24-48 hours), allow resend after 60 seconds, limit resend attempts (3/hour).
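Token generation for step 2 can be sketched like this (the `VerificationToken` shape is illustrative; persist it in whatever store your app uses):

```typescript
import { randomBytes } from 'node:crypto';

interface VerificationToken {
  token: string;
  email: string;
  expiresAt: Date;
}

function createVerificationToken(email: string, ttlHours = 24): VerificationToken {
  return {
    // 32 random bytes → 64 hex chars: unguessable and URL-safe
    token: randomBytes(32).toString('hex'),
    email,
    expiresAt: new Date(Date.now() + ttlHours * 60 * 60 * 1000),
  };
}

function isTokenValid(t: VerificationToken, now: Date = new Date()): boolean {
  return now < t.expiresAt;
}
```

Delete the token after first use so links are single-use.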
## Single vs Double Opt-In

| | Single Opt-In | Double Opt-In |
|--|---------------|---------------|
| **Process** | Add to list immediately | Require email confirmation first |
| **Pros** | Lower friction, faster growth | Verified addresses, better engagement, helps demonstrate GDPR/CASL consent |
| **Cons** | Higher invalid rate, lower engagement | Some users never confirm |
| **Use for** | Account creation, transactional | Marketing lists, newsletters |

**Recommendation:** Double opt-in for all marketing emails.

## Form Design

### Email Input

- Use `type="email"` for the right mobile keyboard
- Include a placeholder ("you@example.com")
- Write clear error messages ("Please enter a valid email address", not "Invalid")

### Consent Checkboxes (Marketing)

- **Unchecked by default** (required)
- Specific language about what they're signing up for
- Separate checkboxes for different email types
- Link to your privacy policy

```
☐ Subscribe to our weekly newsletter with product updates
☐ Send me promotional offers and deals
```

**Don't:** Pre-check boxes, use vague language, bury consent in the terms.

### Form Layout

- Keep it simple and focused
- One primary action
- Clear value proposition
- Mobile-friendly
- Accessible (labels, ARIA)

## Error Handling

### Invalid Email

- Show a clear error message
- Suggest corrections for common typos (@gmial.com → @gmail.com)
- Allow the user to fix and resubmit
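Typo suggestion can start as a simple lookup table of known misspellings (the table and `suggestCorrection` helper are illustrative; real implementations often add edit-distance matching against popular domains):

```typescript
// Known-misspelling lookup; extend with the typos you actually see.
const KNOWN_TYPOS: Record<string, string> = {
  'gmial.com': 'gmail.com',
  'gmai.com': 'gmail.com',
  'yaho.com': 'yahoo.com',
  'hotmial.com': 'hotmail.com',
};

function suggestCorrection(email: string): string | null {
  const at = email.lastIndexOf('@');
  if (at < 0) return null;
  const local = email.slice(0, at);
  const domain = email.slice(at + 1).toLowerCase();
  const fixed = KNOWN_TYPOS[domain];
  return fixed ? `${local}@${fixed}` : null;
}
```

Show the suggestion as "Did you mean user@gmail.com?" rather than silently rewriting the input.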
### Already Registered

- Accounts: "This email is already registered. [Sign in]"
- Marketing: "You're already subscribed! [Manage preferences]"
- For sensitive flows (e.g., password reset), don't reveal whether an account exists (prevents enumeration)

### Rate Limiting

- Limit verification emails (3/hour per address)
- Rate limit form submissions
- Use CAPTCHA sparingly if needed
- Monitor for abuse patterns
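The per-address limit can be sketched as a sliding window (in-memory here for illustration; production deployments would use Redis or a database so limits survive restarts and work across instances):

```typescript
// Sliding-window limiter: at most `limit` sends per `windowMs` per address.
const sendTimes = new Map<string, number[]>();

function canSendVerification(
  email: string,
  now: number = Date.now(),
  limit = 3,
  windowMs = 60 * 60 * 1000, // 1 hour
): boolean {
  // Keep only timestamps still inside the window.
  const recent = (sendTimes.get(email) ?? []).filter(t => now - t < windowMs);
  if (recent.length >= limit) {
    sendTimes.set(email, recent);
    return false; // over the limit; ask the user to wait
  }
  recent.push(now);
  sendTimes.set(email, recent);
  return true;
}
```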
## Verification Emails

**Content:**
- Clear purpose ("Verify your email address")
- Prominent verification button
- Expiration time
- Resend option
- "I didn't request this" notice

**Design:**
- Mobile-friendly
- Large, tappable button
- Clear call-to-action

See [Transactional Emails](./transactional-emails.md) for detailed email design guidance.

## Related

- [Compliance](./compliance.md) - Legal requirements for consent (GDPR, CASL)
- [Marketing Emails](./marketing-emails.md) - What happens after capture
- [Deliverability](./deliverability.md) - How validation improves sender reputation
# Email Types: Transactional vs Marketing

Understanding the difference between transactional and marketing emails is crucial for compliance, deliverability, and user experience. This guide explains the distinctions and provides a catalog of transactional emails your app should include.

## When to Use This

- Deciding whether an email should be transactional or marketing
- Understanding legal distinctions between email types
- Planning what transactional emails your app needs
- Ensuring compliance with email regulations
- Setting up separate sending infrastructure

## Transactional vs Marketing: Key Differences

### Transactional Emails

**Definition:** Emails that facilitate or confirm a transaction the user initiated or expects. They're directly related to an action the user took.

**Characteristics:**
- User-initiated or expected
- Time-sensitive and actionable
- Required for the user to complete an action
- Not promotional in nature
- Can be sent without explicit opt-in (with limitations)

**Examples:**
- Password reset links
- Order confirmations
- Account verification
- OTP/2FA codes
- Shipping notifications

### Marketing Emails

**Definition:** Emails sent for promotional, advertising, or informational purposes that are not directly related to a specific transaction.

**Characteristics:**
- Promotional or informational content
- Not time-sensitive to complete a transaction
- Require explicit opt-in (consent)
- Must include unsubscribe options
- Subject to stricter compliance requirements

**Examples:**
- Newsletters
- Product announcements
- Promotional offers
- Company updates
- Educational content
## Legal Distinctions

### CAN-SPAM Act (US)

**Transactional emails:**
- Can be sent without opt-in
- Must be related to a transaction
- Cannot contain promotional content (with exceptions)
- Must identify the sender and provide contact information

**Marketing emails:**
- Require an opt-out mechanism (the US is opt-out, not opt-in)
- Must include clear sender identification
- Must include a physical mailing address
- Must honor opt-out requests within 10 business days

### GDPR (EU)

**Transactional emails:**
- Can be sent based on legitimate interest or contract fulfillment
- Must be necessary for service delivery
- Cannot contain marketing content without consent

**Marketing emails:**
- Require explicit opt-in consent
- Must clearly state the purpose of data collection
- Must provide easy unsubscribe
- Subject to data protection requirements

### CASL (Canada)

**Transactional emails:**
- Can be sent without consent if related to an ongoing business relationship
- Must be factual and not promotional

**Marketing emails:**
- Require express or implied consent
- Must include an unsubscribe mechanism
- Must identify the sender clearly
## When to Use Each Type

### Use Transactional When:

- User needs the email to complete an action
- Email confirms a transaction or account change
- Email provides security-related information
- Email is expected based on user action
- Content is time-sensitive and actionable

### Use Marketing When:

- Promoting products or services
- Sending newsletters or updates
- Sharing educational content
- Announcing features or company news
- Content is not required for a transaction

## Hybrid Emails: The Gray Area

Some emails mix transactional and marketing content. Be careful:

**Best practice:** Keep transactional and marketing separate. If you must include marketing in a transactional email:
- Make the transactional content primary
- Keep marketing content minimal and clearly separated
- Ensure the transactional purpose is clear
- Check local regulations (some regions prohibit this)

**Example of an acceptable hybrid:**
- Order confirmation (transactional) with a small "You might also like" section (marketing)

**Example of a problematic hybrid:**
- Newsletter (marketing) with a small order status update (transactional)

## Transactional Email Catalog

For a complete catalog of transactional emails and recommended combinations by app type, see [Transactional Email Catalog](./transactional-email-catalog.md).

**Quick reference - essential emails for most apps:**

1. **Email verification** - Required for account creation
2. **Password reset** - Required for account recovery
3. **Welcome email** - Good user experience

The catalog includes detailed guidance for:
- Authentication-focused apps
- Newsletter / content platforms
- E-commerce / marketplaces
- SaaS / subscription services
- Financial / fintech apps
- Social / community platforms
- Developer tools / API platforms
- Healthcare / HIPAA-compliant apps
## Sending Infrastructure

### Separate Infrastructure

**Best practice:** Use separate sending infrastructure for transactional and marketing emails.

**Benefits:**
- Protect transactional deliverability
- Different authentication domains
- Independent reputation
- Easier compliance management

**Implementation:**
- Use different subdomains (e.g., `mail.app.com` for transactional, `news.app.com` for marketing)
- Separate email service accounts or API keys
- Different monitoring and alerting

### Email Service Considerations

Choose an email service that:
- Provides reliable delivery for transactional emails
- Offers separate sending domains
- Has a good API for programmatic sending
- Provides webhooks for delivery events
- Supports authentication setup (SPF, DKIM, DMARC)

Services like Resend are designed for transactional email and provide the infrastructure and tools needed for reliable delivery.

## Related Topics

- [Transactional Emails](./transactional-emails.md) - Best practices for sending transactional emails
- [Marketing Emails](./marketing-emails.md) - Best practices for marketing emails
- [Compliance](./compliance.md) - Legal requirements for each email type
- [Deliverability](./deliverability.md) - Ensuring transactional emails are delivered
# List Management

Maintaining clean email lists through suppression, hygiene, and data retention.

## Suppression Lists

A suppression list prevents sending to addresses that should never receive email.

### What to Suppress

| Reason | Action | Can Unsuppress? |
|--------|--------|-----------------|
| Hard bounce | Add immediately | No (address invalid) |
| Complaint (spam) | Add immediately | No (legal requirement) |
| Unsubscribe | Add immediately | Only if user re-subscribes |
| Soft bounce (3x) | Add after threshold | Yes, after 30-90 days |
| Manual removal | Add on request | Only if user requests |

### Implementation

```typescript
// Suppression list schema
interface SuppressionEntry {
  email: string;
  reason: 'hard_bounce' | 'complaint' | 'unsubscribe' | 'soft_bounce' | 'manual';
  created_at: Date;
  source_email_id?: string; // Which email triggered this
}

// Check before every send (normalize case to match stored entries)
async function canSendTo(email: string): Promise<boolean> {
  const suppressed = await db.suppressions.findOne({ email: email.toLowerCase() });
  return !suppressed;
}

// Add to suppression list (upsert keeps the operation idempotent)
async function suppressEmail(email: string, reason: string, sourceId?: string) {
  await db.suppressions.upsert({
    email: email.toLowerCase(),
    reason,
    created_at: new Date(),
    source_email_id: sourceId,
  });
}
```
### Pre-Send Check

**Always check suppression before sending:**

```typescript
async function sendEmail(to: string, emailData: EmailData) {
  if (!await canSendTo(to)) {
    console.log(`Skipping suppressed email: ${to}`);
    return { skipped: true, reason: 'suppressed' };
  }

  return await resend.emails.send({ to, ...emailData });
}
```
## List Hygiene

Regular maintenance keeps lists healthy.

### Automated Cleanup

| Task | Frequency | Action |
|------|-----------|--------|
| Remove hard bounces | Real-time (via webhook) | Immediate suppression |
| Remove complaints | Real-time (via webhook) | Immediate suppression |
| Process unsubscribes | Real-time | Remove from marketing lists |
| Review soft bounces | Daily | Suppress after 3 failures |
| Remove inactive | Monthly | Re-engagement → remove |

### Re-engagement Campaigns

Before removing inactive subscribers:

1. **Identify inactive:** No opens/clicks in 90-180 days
2. **Send re-engagement:** "We miss you" or "Still interested?"
3. **Wait 14-30 days** for a response
4. **Remove non-responders** from active lists

```typescript
// Helpers (getInactiveSubscribers, sendReengagementEmail, etc.) are
// app-specific; this shows the control flow.
async function runReengagement() {
  const inactive = await getInactiveSubscribers(90); // 90 days

  for (const subscriber of inactive) {
    if (!subscriber.reengagement_sent) {
      await sendReengagementEmail(subscriber);
      await markReengagementSent(subscriber.email);
    } else if (daysSince(subscriber.reengagement_sent) > 30) {
      await removeFromMarketingLists(subscriber.email);
    }
  }
}
```
## Data Retention

### Email Logs

| Data Type | Recommended Retention | Notes |
|-----------|----------------------|-------|
| Send attempts | 90 days | Debugging, analytics |
| Delivery status | 90 days | Compliance, reporting |
| Bounce/complaint events | 3 years | Required for CASL |
| Suppression list | Indefinite | Never delete |
| Email content | 30 days | Storage costs |
| Consent records | 3 years after expiry | Legal requirement |

### Retention Policy Implementation

```typescript
// Daily cleanup job
async function cleanupOldData() {
  const now = new Date();

  // Delete old email logs (keep 90 days)
  await db.emailLogs.deleteMany({
    created_at: { $lt: subDays(now, 90) }
  });

  // Delete old email content (keep 30 days)
  await db.emailContent.deleteMany({
    created_at: { $lt: subDays(now, 30) }
  });

  // Never delete: suppressions, consent records
}
```
## Metrics to Monitor

| Metric | Target | Alert Threshold |
|--------|--------|-----------------|
| Bounce rate | <2% | >5% |
| Complaint rate | <0.1% | >0.2% |
| Suppression list growth | Stable | Sudden spike |
| List churn | <2%/month | >5%/month |

## Transactional vs Marketing Lists

**Keep them separate:**
- Transactional: Can send to anyone with an account relationship
- Marketing: Only opted-in subscribers

**Suppression applies to both:** Hard bounces and complaints suppress across all email types.

**Unsubscribe is marketing-only:** A user who unsubscribes from marketing can still receive transactional emails (password resets, order confirmations).

## Related

- [Webhooks & Events](./webhooks-events.md) - Receive bounce/complaint notifications
- [Deliverability](./deliverability.md) - How list hygiene affects sender reputation
- [Compliance](./compliance.md) - Legal requirements for data retention
# Marketing Email Best Practices

Promotional emails that require explicit consent and should provide value to recipients.

## Core Principles

1. **Consent first** - Explicit opt-in required (especially under GDPR/CASL)
2. **Value-driven** - Provide useful content, not just promotions
3. **Respect preferences** - Let users control frequency and content types

## Opt-In Requirements

### Explicit Opt-In

**What counts:**
- User checks an unchecked box
- User clicks a "Subscribe" button
- User completes a form with clear subscription intent

**What doesn't count:**
- Pre-checked boxes
- Opt-out models
- Consent assumed from a purchase
- Purchased/rented lists

### Informed Consent

Disclose: email types, frequency, sender identity, and how to unsubscribe.

✅ "Subscribe to our weekly newsletter with product updates and tips"
❌ "Sign up for emails"

### Double Opt-In (Recommended)

1. User submits email
2. Send confirmation email with a verification link
3. User clicks to confirm
4. Add to the list only after confirmation

Benefits: Verifies deliverability, confirms intent, reduces complaints, and is effectively required in some regions (e.g., Germany).

## Unsubscribe Requirements

**Must be:**
- Prominent in every email
- One-click (preferred) or a simple process
- Immediate (GDPR) or within 10 business days (CAN-SPAM)
- Free, with no login required

**Preference center options:** Frequency (daily/weekly/monthly), content types, complete unsubscribe.
## Content and Design

### Subject Lines

- Clear and specific (50 characters or less for mobile)
- Create curiosity without misleading
- A/B test regularly

✅ "Your weekly digest: 5 productivity tips"
❌ "You won't believe what happened!"

### Structure

**Above the fold:** Value proposition, primary CTA, engaging visual

**Body:** Scannable (short paragraphs, bullets), clear hierarchy, multiple CTAs

**Footer:** Unsubscribe link, company info, physical address (CAN-SPAM), social links

### Mobile-First

- Single-column layout
- 44x44px minimum buttons
- 16px minimum text
- Test on iOS, Android, and dark mode

## Segmentation

**Segment by:** Behavior (purchases, activity), demographics, preferences, engagement level, signup source.

Benefits: Higher open/click rates, fewer unsubscribes, better experience.

## Personalization

**Options:** Name in subject/greeting, location-specific content, behavior-based recommendations, purchase history.

**Don't over-personalize** - it can feel intrusive. Use only data you have permission to use.

## Frequency and Timing

**Frequency:** Start conservative, increase based on engagement, let users set preferences, monitor unsubscribe rates.

**Timing:** Weekday mornings (9-11 AM local) and Tuesday-Thursday often perform best. Test with your specific audience.

## List Hygiene

**Remove immediately:** Hard bounces, unsubscribes, complaints

**Remove after inactivity:** Send a re-engagement campaign first, then remove non-responders

**Monitor:** Bounce rate <2%, complaint rate <0.1%

## Required Elements (All Marketing Emails)

- Clear sender identification
- Physical mailing address (CAN-SPAM)
- Unsubscribe mechanism
- Indication that it's marketing (GDPR)

## Related

- [Compliance](./compliance.md) - Detailed legal requirements by region
- [Email Capture](./email-capture.md) - Collecting consent properly
- [List Management](./list-management.md) - Maintaining list hygiene
@@ -0,0 +1,155 @@
|
|||||||
|

# Sending Reliability

Ensuring emails are sent exactly once and handling failures gracefully.

## Idempotency

Prevent duplicate emails when retrying failed requests.

### The Problem

Network issues, timeouts, or server errors can leave you uncertain if an email was sent. Retrying without idempotency risks sending duplicates.

### Solution: Idempotency Keys

Send a unique key with each request. If the same key is sent again, the server returns the original response instead of sending another email.

```typescript
// Generate deterministic key based on the business event
const idempotencyKey = `password-reset-${userId}-${resetRequestId}`;

await resend.emails.send({
  from: 'noreply@example.com',
  to: user.email,
  subject: 'Reset your password',
  html: emailHtml,
}, {
  headers: {
    'Idempotency-Key': idempotencyKey
  }
});
```

### Key Generation Strategies

| Strategy | Example | Use When |
|----------|---------|----------|
| Event-based | `order-confirm-${orderId}` | One email per event (recommended) |
| Request-scoped | `reset-${userId}-${resetRequestId}` | Retries within same request |
| UUID | `crypto.randomUUID()` | No natural key (generate once, reuse on retry) |

**Best practice:** Use deterministic keys based on the business event. If you retry the same logical send, the same key must be generated. Avoid `Date.now()` or random values generated fresh on each attempt.

**Key expiration:** Idempotency keys are typically cached for 24 hours. Retries within this window return the original response. After expiration, the same key triggers a new send—so complete your retry logic well within 24 hours.

## Retry Logic

Handle transient failures with exponential backoff.

### When to Retry

| Error Type | Retry? | Notes |
|------------|--------|-------|
| 5xx (server error) | ✅ Yes | Transient, likely to resolve |
| 429 (rate limit) | ✅ Yes | Wait for rate limit window |
| 4xx (client error) | ❌ No | Fix the request first |
| Network timeout | ✅ Yes | Transient |
| DNS failure | ✅ Yes | May be transient |

### Exponential Backoff

```typescript
// Promise-based delay used below
const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function sendWithRetry(emailData, maxRetries = 3) {
  for (let attempt = 0; attempt < maxRetries; attempt++) {
    try {
      return await resend.emails.send(emailData);
    } catch (error) {
      if (!isRetryable(error) || attempt === maxRetries - 1) {
        throw error;
      }
      const delay = Math.min(1000 * Math.pow(2, attempt), 30000);
      await sleep(delay + Math.random() * 1000); // Add jitter
    }
  }
}

function isRetryable(error) {
  return error.statusCode >= 500 ||
    error.statusCode === 429 ||
    error.code === 'ETIMEDOUT';
}
```

**Backoff schedule:** 1s → 2s → 4s → 8s (with jitter to prevent thundering herd)

## Error Handling

### Common Error Codes

| Code | Meaning | Action |
|------|---------|--------|
| 400 | Bad request | Fix payload (invalid email, missing field) |
| 401 | Unauthorized | Check API key |
| 403 | Forbidden | Check permissions, domain verification |
| 404 | Not found | Check endpoint URL |
| 422 | Validation error | Fix request data |
| 429 | Rate limited | Back off, retry after delay |
| 500 | Server error | Retry with backoff |
| 503 | Service unavailable | Retry with backoff |

### Error Handling Pattern

```typescript
try {
  const result = await resend.emails.send(emailData);
  await logSuccess(result.id, emailData);
} catch (error) {
  if (error.statusCode === 429) {
    await queueForRetry(emailData, error.retryAfter);
  } else if (error.statusCode >= 500) {
    await queueForRetry(emailData);
  } else {
    await logFailure(error, emailData);
    await alertOnCriticalEmail(emailData); // For password resets, etc.
  }
}
```

## Queuing for Reliability

For critical emails, use a queue to ensure delivery even if the initial send fails.

**Benefits:**
- Survives application restarts
- Automatic retry handling
- Rate limit management
- Audit trail

**Simple pattern:**
1. Write email to queue/database with "pending" status
2. Process queue, attempt send
3. On success: mark "sent", store message ID
4. On retryable failure: increment retry count, schedule retry
5. On permanent failure: mark "failed", alert
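
The five steps above can be sketched as one processing function. This is a minimal in-memory sketch; the `QueuedEmail` shape, the injected `send` and `isRetryable` callbacks, and `MAX_RETRIES` are illustrative assumptions, and real code would persist the item and alert on permanent failure:

```typescript
type Status = 'pending' | 'sent' | 'failed';

interface QueuedEmail {
  id: string;
  payload: unknown;
  status: Status;
  retries: number;
  messageId?: string;
}

const MAX_RETRIES = 3;

// Processes one queued email: mark "sent" on success, leave it
// "pending" with an incremented retry count on transient failure,
// and mark "failed" once retries are exhausted or the error is permanent.
async function processQueued(
  item: QueuedEmail,
  send: (payload: unknown) => Promise<{ id: string }>,
  isRetryable: (err: unknown) => boolean
): Promise<QueuedEmail> {
  try {
    const result = await send(item.payload);
    return { ...item, status: 'sent', messageId: result.id };
  } catch (err) {
    if (isRetryable(err) && item.retries + 1 < MAX_RETRIES) {
      return { ...item, retries: item.retries + 1 }; // stays "pending"
    }
    return { ...item, status: 'failed' }; // permanent: alert here
  }
}
```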

## Timeouts

Set appropriate timeouts to avoid hanging requests.

```typescript
const controller = new AbortController();
const timeout = setTimeout(() => controller.abort(), 10000);

try {
  await resend.emails.send(emailData, { signal: controller.signal });
} finally {
  clearTimeout(timeout);
}
```

**Recommended:** 10-30 seconds for email API calls.

## Related

- [Webhooks & Events](./webhooks-events.md) - Process delivery confirmations and failures
- [List Management](./list-management.md) - Handle bounces and suppress invalid addresses

# Transactional Email Catalog

A comprehensive catalog of transactional emails organized by category, plus recommended email combinations for different app types.

## When to Use This

- Planning what transactional emails your app needs
- Choosing the right emails for your app type
- Understanding what content each email type should include
- Implementing transactional email features

## Email Combinations by App Type

Use these combinations as a starting point based on what you're building.

### Authentication-Focused App

Apps where user accounts and security are core (login systems, identity providers, account management).

**Essential:**
- Email verification
- Password reset
- OTP / 2FA codes
- Security alerts (new device, password change)
- Account update notifications

**Optional:**
- Welcome email
- Account deletion confirmation

### Newsletter / Content Platform

Apps focused on content delivery and subscriptions.

**Essential:**
- Email verification
- Password reset
- Welcome email
- Subscription confirmation

**Optional:**
- OTP / 2FA codes
- Account update notifications

### E-commerce / Marketplace

Apps where users buy products or services.

**Essential:**
- Email verification
- Password reset
- Welcome email
- Order confirmation
- Shipping notifications
- Invoice / receipt
- Payment failed notices

**Optional:**
- OTP / 2FA codes
- Security alerts
- Subscription confirmations (for recurring orders)

### SaaS / Subscription Service

Apps with paid subscription tiers and ongoing billing.

**Essential:**
- Email verification
- Password reset
- Welcome email
- OTP / 2FA codes
- Security alerts
- Subscription confirmation
- Subscription renewal notice
- Payment failed notices
- Invoice / receipt

**Optional:**
- Account update notifications
- Feature change notifications (for breaking changes)

### Financial / Fintech App

Apps handling money, payments, or sensitive financial data.

**Essential:**
- Email verification
- Password reset
- OTP / 2FA codes (required for sensitive actions)
- Security alerts (all types)
- Account update notifications
- Transaction confirmations
- Invoice / receipt
- Payment failed notices

**Optional:**
- Welcome email
- Compliance notices

### Social / Community Platform

Apps focused on user interaction and community features.

**Essential:**
- Email verification
- Password reset
- Welcome email
- Security alerts

**Optional:**
- OTP / 2FA codes
- Account update notifications
- Activity notifications (mentions, replies)

### Developer Tools / API Platform

Apps targeting developers with API access and integrations.

**Essential:**
- Email verification
- Password reset
- OTP / 2FA codes
- Security alerts
- API key notifications (creation, expiration)
- Subscription confirmation
- Payment failed notices

**Optional:**
- Welcome email
- Usage alerts (approaching limits)
- Feature change notifications

### Healthcare / HIPAA-Compliant App

Apps handling protected health information.

**Essential:**
- Email verification
- Password reset
- OTP / 2FA codes (required)
- Security alerts (all types, detailed)
- Account update notifications
- Appointment confirmations

**Optional:**
- Welcome email
- Compliance notices

**Note:** Healthcare apps have strict requirements. Emails should contain minimal PHI and link to secure portals for sensitive information.

---

## Full Email Catalog

### Authentication & Security

#### Email Verification / Account Verification

**When to send:** Immediately after user signs up or changes email address.

**Purpose:** Verify the email address belongs to the user.

**Content should include:**
- Clear verification link or code
- Expiration time (typically 24-48 hours)
- Instructions on what to do
- Security notice if link is clicked by mistake

**Best practices:**
- Send immediately (within seconds)
- Include expiration notice
- Provide resend option
- Link to support if issues arise

#### OTP / 2FA Codes

**When to send:** When user requests two-factor authentication code.

**Purpose:** Provide time-sensitive authentication code.

**Content should include:**
- The OTP code (clearly displayed)
- Expiration time (typically 5-10 minutes)
- Security warnings
- Instructions on what to do if not requested

**Best practices:**
- Send immediately
- Code should be large and easy to read
- Include expiration prominently
- Warn about sharing codes
- Provide "I didn't request this" link
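
The code itself is typically generated server-side. A minimal sketch using Node's `crypto.randomInt`, with a 10-minute TTL matching the typical expiry above (the return shape is illustrative):

```typescript
import { randomInt } from 'node:crypto';

const OTP_TTL_MS = 10 * 60 * 1000; // within the typical 5-10 minute expiry

// Generates a 6-digit, zero-padded code plus its expiry timestamp.
// randomInt draws from a CSPRNG, so codes are not predictable.
function generateOtp(now: number = Date.now()) {
  const code = randomInt(0, 1_000_000).toString().padStart(6, '0');
  return { code, expiresAt: now + OTP_TTL_MS };
}
```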

#### Password Reset

**When to send:** When user requests password reset.

**Purpose:** Allow user to securely reset forgotten password.

**Content should include:**
- Reset link (with token)
- Expiration time (typically 1 hour)
- Security warnings
- Instructions if not requested

**Best practices:**
- Send immediately
- Link expires quickly (1 hour)
- Include IP address and location if available
- Provide "I didn't request this" link
- Don't include the old password
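
Assembled from the checklist above, a reset email payload might look like this. A sketch only; the URL shape, copy, and helper name are illustrative:

```typescript
interface ResetEmailInput {
  to: string;
  appName: string;
  baseUrl: string;
  resetToken: string;
}

// Builds a password-reset email that follows the checklist above:
// tokenized link, explicit 1-hour expiry, an "I didn't request this"
// path, and no trace of the old password.
function buildResetEmail(input: ResetEmailInput) {
  const resetUrl = `${input.baseUrl}/reset-password?token=${encodeURIComponent(input.resetToken)}`;
  return {
    to: input.to,
    subject: `Reset your password for ${input.appName}`,
    html: [
      `<p>Click the button below to reset your ${input.appName} password.</p>`,
      `<p><a href="${resetUrl}">Reset Password</a></p>`,
      `<p>This link expires in 1 hour.</p>`,
      `<p>If you didn't request this, you can ignore this email or <a href="${input.baseUrl}/security">review your security settings</a>.</p>`,
    ].join('\n'),
  };
}
```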

#### Security Alerts

**When to send:** When security-relevant events occur (login from new device, password change, etc.).

**Purpose:** Notify user of account security events.

**Content should include:**
- What happened (clear description)
- When it happened
- Location/IP if available
- Action to take if suspicious
- Link to security settings

**Best practices:**
- Send immediately
- Be clear and specific
- Include actionable steps
- Provide way to report suspicious activity

### Account Management

#### Welcome Email

**When to send:** Immediately after successful account creation and verification.

**Purpose:** Welcome new users and guide them to next steps.

**Content should include:**
- Welcome message
- Key features or next steps
- Links to important resources
- Support contact information

**Best practices:**
- Send after email verification
- Keep it focused and actionable
- Don't overwhelm with information
- Set expectations about future emails

#### Account Update Notifications

**When to send:** When user changes account settings (email, password, profile, etc.).

**Purpose:** Confirm account changes and provide security notice.

**Content should include:**
- What changed
- When it changed
- Action to take if unauthorized
- Link to account settings

**Best practices:**
- Send immediately after change
- Be specific about what changed
- Include security notice
- Provide easy way to revert if needed

### E-commerce & Transactions

#### Order Confirmations

**When to send:** Immediately after order is placed.

**Purpose:** Confirm order details and provide receipt.

**Content should include:**
- Order number
- Items ordered with quantities
- Pricing breakdown
- Shipping address
- Estimated delivery date
- Order tracking link (if available)

**Best practices:**
- Send within minutes of order
- Include all order details
- Make it easy to print or save
- Provide customer service contact

#### Shipping Notifications

**When to send:** When order ships, with tracking updates.

**Purpose:** Notify user that order has shipped and provide tracking.

**Content should include:**
- Order number
- Tracking number
- Carrier information
- Expected delivery date
- Tracking link
- Shipping address confirmation

**Best practices:**
- Send when order ships
- Include tracking number prominently
- Provide carrier tracking link
- Update on major tracking milestones

#### Invoices and Receipts

**When to send:** After payment is processed.

**Purpose:** Provide payment confirmation and receipt.

**Content should include:**
- Invoice/receipt number
- Payment amount
- Payment method
- Items/services purchased
- Payment date
- Downloadable PDF (if applicable)

**Best practices:**
- Send immediately after payment
- Include all payment details
- Make it easy to download/save
- Include tax information if applicable

### Subscriptions & Billing

#### Subscription Confirmations

**When to send:** When user subscribes or changes subscription.

**Purpose:** Confirm subscription details and billing information.

**Content should include:**
- Subscription plan details
- Billing amount and frequency
- Next billing date
- Payment method
- Link to manage subscription

**Best practices:**
- Send immediately after subscription
- Clearly state billing terms
- Provide easy cancellation option
- Include support contact

#### Subscription Renewal Notices

**When to send:** Before subscription renews (typically 3-7 days before).

**Purpose:** Notify user of upcoming renewal and charge.

**Content should include:**
- Renewal date
- Amount to be charged
- Payment method on file
- Link to update payment method
- Link to cancel if desired

**Best practices:**
- Send with enough notice (3-7 days)
- Be clear about amount and date
- Make it easy to update payment method
- Provide cancellation option

#### Payment Failed Notices

**When to send:** When subscription payment fails.

**Purpose:** Notify user of payment failure and provide resolution steps.

**Content should include:**
- What happened
- Amount that failed
- Reason for failure (if available)
- Steps to resolve
- Link to update payment method
- Consequences if not resolved

**Best practices:**
- Send immediately after failure
- Be clear about consequences
- Provide easy resolution path
- Include support contact

### Notifications & Updates

#### Feature Announcements (Transactional)

**When to send:** When a feature the user is using changes significantly.

**Purpose:** Notify users of changes that affect their use of the service.

**Content should include:**
- What changed
- How it affects the user
- What action (if any) is needed
- Link to more information

**Best practices:**
- Only for significant changes
- Focus on user impact
- Provide clear next steps
- Link to documentation

**Note:** General feature announcements are marketing emails. Only send as transactional if the change directly affects an active feature the user is using.

## Related Topics

- [Email Types](./email-types.md) - Understanding transactional vs marketing
- [Transactional Emails](./transactional-emails.md) - Best practices for sending transactional emails
- [Compliance](./compliance.md) - Legal requirements for each email type

# Transactional Email Best Practices

Clear, actionable emails that users expect and need—password resets, confirmations, OTPs.

## Core Principles

1. **Clarity over creativity** - Users need to understand and act quickly
2. **Action-oriented** - Clear purpose, obvious primary action
3. **Time-sensitive** - Send immediately (within seconds)

## Subject Lines

**Be specific and include context:**

| ✅ Good | ❌ Bad |
|---------|--------|
| Reset your password for [App] | Action required |
| Your order #12345 has shipped | Update on your order |
| Your 2FA code: 123456 | Security code |
| Verify your email for [App] | Verify your email |

Include identifiers when helpful: order numbers, account names, expiration times.

## Pre-Header

The text snippet shown after the subject line in the inbox preview. Use it to:
- Reinforce the subject ("This link expires in 1 hour")
- Add urgency or context
- Preview the call-to-action

Keep under 90-100 characters.

## Content Structure

**Above the fold (first screen):**
- Clear purpose
- Primary action button
- Time-sensitive details (expiration)

**Hierarchy:** Header → Primary message → Details → Action button → Secondary info

**Format:** Short paragraphs (2-3 sentences), bullet points, bold for emphasis, white space.

## Mobile-First Design

60%+ of emails are opened on mobile.

- **Layout:** Single column, stack vertically
- **Buttons:** 44x44px minimum, full-width on mobile
- **Text:** 16px minimum body, 20-24px headings
- **OTP codes:** 24-32px, monospace font

## Sender Configuration

| Field | Best Practice | Example |
|-------|--------------|---------|
| From Name | App/company name, consistent | [App Name] |
| From Email | Subdomain, real address | hello@mail.yourdomain.com |
| Reply-To | Monitored inbox | support@yourdomain.com |

Avoid `noreply@` - users reply to transactional emails.
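
The From and Reply-To values pair a display name with an address. A small formatter, sketched with a simplified RFC 5322 quoting heuristic (the helper name is illustrative):

```typescript
// Formats an RFC 5322 "From"/"Reply-To" value: display name + address,
// quoting the name when it contains characters beyond letters,
// digits, and spaces (e.g. "," or ".").
function formatSender(name: string, address: string): string {
  const needsQuotes = /[^A-Za-z0-9 ]/.test(name);
  const display = needsQuotes ? `"${name.replace(/"/g, '\\"')}"` : name;
  return `${display} <${address}>`;
}
```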

## Code and Link Display

**OTP/Verification codes:**
- Large (24-32px), monospace font
- Centered, clear label
- Include expiration nearby
- Make copyable

**Buttons:**
- Large, tappable (44x44px+)
- Contrasting colors
- Clear action text ("Reset Password", "Verify Email")
- HTTPS links only

## Error Handling

**Resend functionality:**
- Allow after 60 seconds
- Limit attempts (3 per hour)
- Show countdown timer
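
These limits can be enforced with a small check. A sketch assuming past attempt timestamps (in milliseconds) are stored per recipient:

```typescript
const COOLDOWN_MS = 60 * 1000;    // allow a resend after 60 seconds
const WINDOW_MS = 60 * 60 * 1000; // rolling one-hour window
const MAX_PER_WINDOW = 3;         // limit attempts to 3 per hour

// Decides whether a resend is allowed and, if not, how long the
// countdown timer should run before the next attempt.
function canResend(attempts: number[], now: number = Date.now()) {
  const recent = attempts.filter((t) => now - t < WINDOW_MS);
  if (recent.length >= MAX_PER_WINDOW) {
    const oldest = Math.min(...recent);
    return { allowed: false, retryInMs: WINDOW_MS - (now - oldest) };
  }
  const last = recent.length > 0 ? Math.max(...recent) : -Infinity;
  if (now - last < COOLDOWN_MS) {
    return { allowed: false, retryInMs: COOLDOWN_MS - (now - last) };
  }
  return { allowed: true, retryInMs: 0 };
}
```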

**Expired links:**
- Clear "expired" message
- Offer to send new link
- Provide support contact

**"I didn't request this":**
- Include in password resets, OTPs, security alerts
- Link to security contact
- Log clicks for monitoring

# Webhooks and Events

Receiving and processing email delivery events in real-time.

## Event Types

| Event | When Fired | Use For |
|-------|------------|---------|
| `email.sent` | Email accepted by Resend | Confirming send initiated |
| `email.delivered` | Email delivered to recipient server | Confirming delivery |
| `email.bounced` | Email bounced (hard or soft) | List hygiene, alerting |
| `email.complained` | Recipient marked as spam | Immediate unsubscribe |
| `email.opened` | Recipient opened email | Engagement tracking |
| `email.clicked` | Recipient clicked link | Engagement tracking |

## Webhook Setup

### 1. Create Endpoint

Your endpoint must:
- Accept POST requests
- Return 2xx status quickly (within 5 seconds)
- Handle duplicate events (idempotent processing)

```typescript
app.post('/webhooks/resend', async (req, res) => {
  // Return 200 immediately to acknowledge receipt
  res.status(200).send('OK');

  // Process asynchronously
  processWebhookAsync(req.body).catch(console.error);
});
```

### 2. Verify Signatures

Always verify webhook signatures to prevent spoofing.

```typescript
import { Webhook } from 'svix';

const webhook = new Webhook(process.env.RESEND_WEBHOOK_SECRET);

app.post('/webhooks/resend', (req, res) => {
  try {
    const payload = webhook.verify(
      JSON.stringify(req.body),
      {
        'svix-id': req.headers['svix-id'],
        'svix-timestamp': req.headers['svix-timestamp'],
        'svix-signature': req.headers['svix-signature'],
      }
    );
    // Process verified payload
  } catch (err) {
    return res.status(400).send('Invalid signature');
  }
});
```

### 3. Register Webhook URL

Configure your webhook endpoint in the Resend dashboard or via API.

## Processing Events

### Bounce Handling

```typescript
async function handleBounce(event) {
  const { email_id, email, bounce_type } = event.data;

  if (bounce_type === 'hard') {
    // Permanent failure - remove from all lists
    await suppressEmail(email, 'hard_bounce');
    await removeFromAllLists(email);
  } else {
    // Soft bounce - track and remove after threshold
    await incrementSoftBounce(email);
    const count = await getSoftBounceCount(email);
    if (count >= 3) {
      await suppressEmail(email, 'soft_bounce_limit');
    }
  }
}
```

### Complaint Handling

```typescript
async function handleComplaint(event) {
  const { email } = event.data;

  // Immediate suppression - no exceptions
  await suppressEmail(email, 'complaint');
  await removeFromAllLists(email);
  await logComplaint(event); // For analysis
}
```

### Delivery Confirmation

```typescript
async function handleDelivered(event) {
  const { email_id } = event.data;
  await updateEmailStatus(email_id, 'delivered');
}
```

## Idempotent Processing

Webhooks may be sent multiple times. Use event IDs to prevent duplicate processing.

```typescript
async function processWebhook(event) {
  const eventId = event.id;

  // Check if already processed
  if (await isEventProcessed(eventId)) {
    return; // Skip duplicate
  }

  // Process event
  await handleEvent(event);

  // Mark as processed
  await markEventProcessed(eventId);
}
```

## Error Handling

### Retry Behavior

If your endpoint returns non-2xx, webhooks will retry with exponential backoff:
- Retry 1: ~30 seconds
- Retry 2: ~1 minute
- Retry 3: ~5 minutes
- (continues for ~24 hours)

### Best Practices

- **Return 200 quickly** - Process asynchronously to avoid timeouts
- **Be idempotent** - Handle duplicate deliveries gracefully
- **Log everything** - Store raw events for debugging
- **Alert on failures** - Monitor webhook processing errors
- **Queue for processing** - Use a job queue for complex handling

## Testing Webhooks

**Local development:** Use ngrok or similar to expose localhost.

```bash
ngrok http 3000
# Use the ngrok URL as your webhook endpoint
```

**Verify handling:** Send test events through Resend dashboard or manually trigger each event type.

## Related

- [List Management](./list-management.md) - What to do with bounce/complaint data
- [Sending Reliability](./sending-reliability.md) - Retry logic when sends fail

profiles/opencode/skill/frontend-design/SKILL.md

---
name: frontend-design
description: Create distinctive, production-grade frontend interfaces with high design quality. Use this skill when the user asks to build web components, pages, artifacts, posters, or applications (examples include websites, landing pages, dashboards, React components, HTML/CSS layouts, or when styling/beautifying any web UI). Generates creative, polished code and UI design that avoids generic AI aesthetics.
---

This skill guides creation of distinctive, production-grade frontend interfaces that avoid generic "AI slop" aesthetics. Implement real working code with exceptional attention to aesthetic details and creative choices.

The user provides frontend requirements: a component, page, application, or interface to build. They may include context about the purpose, audience, or technical constraints.

## Design Thinking

Before coding, understand the context and commit to a BOLD aesthetic direction:
- **Purpose**: What problem does this interface solve? Who uses it?
- **Tone**: Pick an extreme: brutally minimal, maximalist chaos, retro-futuristic, organic/natural, luxury/refined, playful/toy-like, editorial/magazine, brutalist/raw, art deco/geometric, soft/pastel, industrial/utilitarian, etc. There are many flavors to choose from; use these for inspiration, but design one that is true to the aesthetic direction.
- **Constraints**: Technical requirements (framework, performance, accessibility).
- **Differentiation**: What makes this UNFORGETTABLE? What's the one thing someone will remember?

**CRITICAL**: Choose a clear conceptual direction and execute it with precision. Bold maximalism and refined minimalism both work - the key is intentionality, not intensity.

Then implement working code (HTML/CSS/JS, React, Vue, etc.) that is:
- Production-grade and functional
- Visually striking and memorable
- Cohesive with a clear aesthetic point-of-view
- Meticulously refined in every detail

## Frontend Aesthetics Guidelines

Focus on:
- **Typography**: Choose fonts that are beautiful, unique, and interesting. Avoid generic fonts like Arial and Inter; opt instead for distinctive, characterful choices that elevate the frontend's aesthetics. Pair a distinctive display font with a refined body font.
- **Color & Theme**: Commit to a cohesive aesthetic. Use CSS variables for consistency. Dominant colors with sharp accents outperform timid, evenly-distributed palettes.
- **Motion**: Use animations for effects and micro-interactions. Prioritize CSS-only solutions for HTML. Use the Motion library for React when available. Focus on high-impact moments: one well-orchestrated page load with staggered reveals (animation-delay) creates more delight than scattered micro-interactions. Use scroll-triggering and hover states that surprise.
- **Spatial Composition**: Unexpected layouts. Asymmetry. Overlap. Diagonal flow. Grid-breaking elements. Generous negative space OR controlled density.
- **Backgrounds & Visual Details**: Create atmosphere and depth rather than defaulting to solid colors. Add contextual effects and textures that match the overall aesthetic. Apply creative forms like gradient meshes, noise textures, geometric patterns, layered transparencies, dramatic shadows, decorative borders, custom cursors, and grain overlays.

NEVER use generic AI-generated aesthetics like overused font families (Inter, Roboto, Arial, system fonts), cliched color schemes (particularly purple gradients on white backgrounds), predictable layouts and component patterns, and cookie-cutter design that lacks context-specific character.
|
||||||
|
|
||||||
|
Interpret creatively and make unexpected choices that feel genuinely designed for the context. No design should be the same. Vary between light and dark themes, different fonts, different aesthetics. NEVER converge on common choices (Space Grotesk, for example) across generations.
|
||||||
|
|
||||||
|
**IMPORTANT**: Match implementation complexity to the aesthetic vision. Maximalist designs need elaborate code with extensive animations and effects. Minimalist or refined designs need restraint, precision, and careful attention to spacing, typography, and subtle details. Elegance comes from executing the vision well.
|
||||||
|
|
||||||
|
Remember: Claude is capable of extraordinary creative work. Don't hold back, show what can truly be created when thinking outside the box and committing fully to a distinctive vision.
|
||||||
profiles/opencode/skill/librarian/SKILL.md
@@ -0,0 +1,123 @@
---
name: librarian
description: Multi-repository codebase exploration. Research library internals, find code patterns, understand architecture, compare implementations across GitHub/npm/PyPI/crates. Use when needing deep understanding of how libraries work, finding implementations across open source, or exploring remote repository structure.
references:
- references/tool-routing.md
- references/opensrc-api.md
- references/opensrc-examples.md
- references/linking.md
- references/diagrams.md
---

# Librarian Skill

Deep codebase exploration across remote repositories.

## How to Use This Skill

### Reference Structure

| File | Purpose | When to Read |
|------|---------|--------------|
| `tool-routing.md` | Tool selection decision trees | **Always read first** |
| `opensrc-api.md` | API reference, types | Writing opensrc code |
| `opensrc-examples.md` | JavaScript patterns, workflows | Implementation examples |
| `linking.md` | GitHub URL patterns | Formatting responses |
| `diagrams.md` | Mermaid patterns | Visualizing architecture |

### Reading Order

1. **Start** with `tool-routing.md` → choose tool strategy
2. **If using opensrc:**
   - Read `opensrc-api.md` for API details
   - Read `opensrc-examples.md` for patterns
3. **Before responding:** `linking.md` + `diagrams.md` for output formatting

## Tool Arsenal

| Tool | Best For | Limitations |
|------|----------|-------------|
| **grep_app** | Find patterns across ALL public GitHub | Literal search only |
| **context7** | Library docs, API examples, usage | Known libraries only |
| **opensrc** | Fetch full source for deep exploration | Must fetch before read |

## Quick Decision Trees

### "How does X work?"

```
Known library?
├─ Yes → context7.resolve-library-id → context7.query-docs
│        └─ Need internals? → opensrc.fetch → read source
└─ No → grep_app search → opensrc.fetch top result
```

### "Find pattern X"

```
Specific repo?
├─ Yes → opensrc.fetch → opensrc.grep → read matches
└─ No → grep_app (broad) → opensrc.fetch interesting repos
```

### "Explore repo structure"

```
1. opensrc.fetch(target)
2. opensrc.tree(source.name) → quick overview
3. opensrc.files(source.name, "**/*.ts") → detailed listing
4. Read: README, package.json, src/index.*
5. Create architecture diagram (see diagrams.md)
```

### "Compare X vs Y"

```
1. opensrc.fetch(["X", "Y"])
2. Use source.name from results for subsequent calls
3. opensrc.grep(pattern, { sources: [nameX, nameY] })
4. Read comparable files, synthesize differences
```

## Critical: Source Naming Convention

**After fetching, always use `source.name` for subsequent calls:**

```javascript
const [{ source }] = await opensrc.fetch("vercel/ai");
const files = await opensrc.files(source.name, "**/*.ts");
```

| Type | Fetch Spec | Source Name |
|------|------------|-------------|
| npm | `"zod"` | `"zod"` |
| npm scoped | `"@tanstack/react-query"` | `"@tanstack/react-query"` |
| pypi | `"pypi:requests"` | `"requests"` |
| crates | `"crates:serde"` | `"serde"` |
| GitHub | `"vercel/ai"` | `"github.com/vercel/ai"` |
| GitLab | `"gitlab:org/repo"` | `"gitlab.com/org/repo"` |

## When NOT to Use opensrc

| Scenario | Use Instead |
|----------|-------------|
| Simple library API questions | context7 |
| Finding examples across many repos | grep_app |
| Very large monorepos (>10GB) | Clone locally |
| Private repositories | Direct access |

## Output Guidelines

1. **Comprehensive final message** - only the last message returns to the main agent
2. **Parallel tool calls** - maximize efficiency
3. **Link every file reference** - see `linking.md`
4. **Diagram complex relationships** - see `diagrams.md`
5. **Never mention tool names** - say "I'll search" not "I'll use opensrc"

## References

- [Tool Routing Decision Trees](references/tool-routing.md)
- [opensrc API Reference](references/opensrc-api.md)
- [opensrc Code Examples](references/opensrc-examples.md)
- [GitHub Linking Patterns](references/linking.md)
- [Mermaid Diagram Patterns](references/diagrams.md)
profiles/opencode/skill/librarian/references/diagrams.md
@@ -0,0 +1,51 @@
# Mermaid Diagram Patterns

Create diagrams for:

- Architecture (component relationships)
- Data flow (request → response)
- Dependencies (import graph)
- Sequences (step-by-step processes)

## Architecture

```mermaid
graph TD
    A[Client] --> B[API Gateway]
    B --> C[Auth Service]
    B --> D[Data Service]
    D --> E[(Database)]
```

## Flow

```mermaid
flowchart LR
    Input --> Parse --> Validate --> Transform --> Output
```

## Sequence

```mermaid
sequenceDiagram
    Client->>+Server: Request
    Server->>+DB: Query
    DB-->>-Server: Result
    Server-->>-Client: Response
```

## When to Use

| Type | Use For |
|------|---------|
| `graph TD` | Component hierarchy, dependencies |
| `flowchart LR` | Data transformation, pipelines |
| `sequenceDiagram` | Request/response, multi-party interaction |
| `classDiagram` | Type relationships, inheritance |
| `stateDiagram` | State machines, lifecycle |

## Tips

- Keep nodes short (3-4 words max)
- Use subgraphs for grouping related components
- Use arrow labels for relationship types
- Prefer LR (left-right) for flows, TD (top-down) for hierarchies
profiles/opencode/skill/librarian/references/linking.md
@@ -0,0 +1,61 @@
# GitHub Linking Patterns

All file/dir/code refs → fluent markdown links. Never raw URLs.

## URL Formats

### File
```
https://github.com/{owner}/{repo}/blob/{ref}/{path}
```

### File + Lines
```
https://github.com/{owner}/{repo}/blob/{ref}/{path}#L{start}-L{end}
```

### Directory
```
https://github.com/{owner}/{repo}/tree/{ref}/{path}
```

### GitLab (note `/-/blob/`)
```
https://gitlab.com/{owner}/{repo}/-/blob/{ref}/{path}
```
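The formats above are plain string templates; a minimal sketch of a helper that assembles them (`githubFileUrl` is an illustrative name, not part of any tool API):

```javascript
// Illustrative helper: builds the GitHub blob URL format shown above.
// The line range is optional; passing only startLine yields a single-line anchor.
function githubFileUrl(owner, repo, ref, path, startLine, endLine) {
  let url = `https://github.com/${owner}/${repo}/blob/${ref}/${path}`;
  if (startLine) {
    url += `#L${startLine}` + (endLine ? `-L${endLine}` : "");
  }
  return url;
}
```

For example, `githubFileUrl("colinhacks", "zod", "main", "src/types.ts", 450, 480)` produces the `#L450-L480` link used in the examples below.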
## Ref Resolution

| Source | Use as ref |
|--------|------------|
| Known version | `v{version}` |
| Default branch | `main` or `master` |
| opensrc fetch | ref from result |
| Specific commit | full SHA |

## Examples

### Correct
```markdown
The [`parseAsync`](https://github.com/colinhacks/zod/blob/main/src/types.ts#L450-L480) method handles...
```

### Wrong
```markdown
See https://github.com/colinhacks/zod/blob/main/src/types.ts#L100
The parseAsync method in src/types.ts handles...
```

## Line Numbers

- Single: `#L42`
- Range: `#L42-L50`
- Prefer ranges for context (2-5 lines around key code)

## Registry → GitHub

| Registry | Find repo in |
|----------|--------------|
| npm | `package.json` → `repository` |
| PyPI | `pyproject.toml` or `setup.py` |
| crates | `Cargo.toml` |
profiles/opencode/skill/librarian/references/opensrc-api.md
@@ -0,0 +1,235 @@
# opensrc API Reference

## Tool

Use the **opensrc MCP server** via a single tool:

| Tool | Purpose |
|------|---------|
| `opensrc_execute` | All operations (fetch, read, grep, files, remove, etc.) |

Takes a `code` parameter: a JavaScript async arrow function executed server-side. Source trees stay on the server; only results return.

## API Surface

### Read Operations

```typescript
// List all fetched sources
opensrc.list(): Source[]

// Check if source exists
opensrc.has(name: string, version?: string): boolean

// Get source metadata
opensrc.get(name: string): Source | undefined

// List files with optional glob
opensrc.files(sourceName: string, glob?: string): Promise<FileEntry[]>

// Get directory tree structure (default depth: 3)
opensrc.tree(sourceName: string, options?: { depth?: number }): Promise<TreeNode>

// Regex search file contents
opensrc.grep(pattern: string, options?: GrepOptions): Promise<GrepResult[]>

// AST-based semantic code search
opensrc.astGrep(sourceName: string, pattern: string, options?: AstGrepOptions): Promise<AstGrepMatch[]>

// Read single file
opensrc.read(sourceName: string, filePath: string): Promise<string>

// Batch read multiple files (supports globs!)
opensrc.readMany(sourceName: string, paths: string[]): Promise<Record<string, string>>

// Parse fetch spec
opensrc.resolve(spec: string): Promise<ParsedSpec>
```

### Mutation Operations

```typescript
// Fetch packages/repos
opensrc.fetch(specs: string | string[], options?: { modify?: boolean }): Promise<FetchedSource[]>

// Remove sources
opensrc.remove(names: string[]): Promise<RemoveResult>

// Clean by type
opensrc.clean(options?: CleanOptions): Promise<RemoveResult>
```

## Types

### Source

```typescript
interface Source {
  type: "npm" | "pypi" | "crates" | "repo";
  name: string;       // Use this for all subsequent calls
  version?: string;
  ref?: string;
  path: string;
  fetchedAt: string;
  repository: string;
}
```

### FetchedSource

```typescript
interface FetchedSource {
  source: Source;        // IMPORTANT: use source.name for subsequent calls
  alreadyExists: boolean;
}
```

### GrepOptions

```typescript
interface GrepOptions {
  sources?: string[];   // Filter to specific sources
  include?: string;     // File glob pattern (e.g., "*.ts")
  maxResults?: number;  // Limit results (default: 100)
}
```

### GrepResult

```typescript
interface GrepResult {
  source: string;
  file: string;
  line: number;
  content: string;
}
```

### AstGrepOptions

```typescript
interface AstGrepOptions {
  glob?: string;            // File glob pattern (e.g., "**/*.ts")
  lang?: string | string[]; // Language(s): "js", "ts", "tsx", "html", "css"
  limit?: number;           // Max results (default: 1000)
}
```

### AstGrepMatch

```typescript
interface AstGrepMatch {
  file: string;
  line: number;
  column: number;
  endLine: number;
  endColumn: number;
  text: string;                      // Matched code text
  metavars: Record<string, string>;  // Captured $VAR → text
}
```

#### AST Pattern Syntax

| Pattern | Matches |
|---------|---------|
| `$NAME` | Single node, captures to metavars |
| `$$$ARGS` | Zero or more nodes (variadic), captures |
| `$_` | Single node, no capture |
| `$$$` | Zero or more nodes, no capture |

### FileEntry

```typescript
interface FileEntry {
  path: string;
  size: number;
  isDirectory: boolean;
}
```

### TreeNode

```typescript
interface TreeNode {
  name: string;
  type: "file" | "dir";
  children?: TreeNode[]; // only for dirs
}
```

### CleanOptions

```typescript
interface CleanOptions {
  packages?: boolean;
  repos?: boolean;
  npm?: boolean;
  pypi?: boolean;
  crates?: boolean;
}
```

### RemoveResult

```typescript
interface RemoveResult {
  success: boolean;
  removed: string[];
}
```

## Error Handling

Operations throw on errors. Wrap calls in try/catch if needed:

```javascript
async () => {
  try {
    const content = await opensrc.read("zod", "missing.ts");
    return content;
  } catch (e) {
    return { error: e.message };
  }
}
```

`readMany` returns errors as string values prefixed with `[Error:`:

```javascript
const files = await opensrc.readMany("zod", ["exists.ts", "missing.ts"]);
// { "exists.ts": "content...", "missing.ts": "[Error: ENOENT...]" }

// Filter successful reads
const successful = Object.entries(files)
  .filter(([_, content]) => !content.startsWith("[Error:"));
```

## Package Spec Formats

| Format | Example | Source Name After Fetch |
|--------|---------|------------------------|
| `<name>` | `"zod"` | `"zod"` |
| `<name>@<version>` | `"zod@3.22.0"` | `"zod"` |
| `pypi:<name>` | `"pypi:requests"` | `"requests"` |
| `crates:<name>` | `"crates:serde"` | `"serde"` |
| `owner/repo` | `"vercel/ai"` | `"github.com/vercel/ai"` |
| `owner/repo@ref` | `"vercel/ai@v1.0.0"` | `"github.com/vercel/ai"` |
| `gitlab:owner/repo` | `"gitlab:org/repo"` | `"gitlab.com/org/repo"` |
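The spec → source-name mapping in the table above can be sketched as plain string handling. This is illustrative only (`sourceNameFor` is not part of the opensrc API; real resolution happens server-side via `opensrc.resolve`):

```javascript
// Mirrors the documented Package Spec Formats table. Assumes the table's
// formats are exhaustive; anything else is treated as a plain npm name.
function sourceNameFor(spec) {
  if (spec.startsWith("pypi:")) return spec.slice(5).split("@")[0];
  if (spec.startsWith("crates:")) return spec.slice(7).split("@")[0];
  if (spec.startsWith("gitlab:")) return "gitlab.com/" + spec.slice(7).split("@")[0];
  // "owner/repo" (optionally "@ref") → GitHub; "@scope/pkg" is npm, not GitHub
  if (spec.includes("/") && !spec.startsWith("@")) {
    return "github.com/" + spec.split("@")[0];
  }
  // npm: strip a trailing "@version" but keep a leading scope "@"
  const at = spec.lastIndexOf("@");
  return at > 0 ? spec.slice(0, at) : spec;
}
```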
## Critical Pattern

**Always capture `source.name` from fetch results:**

```javascript
async () => {
  const [{ source }] = await opensrc.fetch("vercel/ai");

  // GitHub repos: "vercel/ai" → "github.com/vercel/ai"
  const sourceName = source.name;

  // Use sourceName for ALL subsequent calls
  const files = await opensrc.files(sourceName, "src/**/*.ts");
  return files;
}
```
profiles/opencode/skill/librarian/references/opensrc-examples.md
@@ -0,0 +1,336 @@
# opensrc Code Examples

## Workflow: Fetch → Explore

### Basic Fetch and Explore with tree()

```javascript
async () => {
  const [{ source }] = await opensrc.fetch("vercel/ai");
  // Get directory structure first
  const tree = await opensrc.tree(source.name, { depth: 2 });
  return tree;
}
```

### Fetch and Read Key Files

```javascript
async () => {
  const [{ source }] = await opensrc.fetch("vercel/ai");
  const sourceName = source.name; // "github.com/vercel/ai"

  const files = await opensrc.readMany(sourceName, [
    "package.json",
    "README.md",
    "src/index.ts"
  ]);

  return { sourceName, files };
}
```

### readMany with Globs

```javascript
async () => {
  const [{ source }] = await opensrc.fetch("zod");
  // Read all package.json files in the monorepo
  const files = await opensrc.readMany(source.name, [
    "packages/*/package.json" // globs supported!
  ]);
  return Object.keys(files);
}
```

### Batch Fetch Multiple Packages

```javascript
async () => {
  const results = await opensrc.fetch(["zod", "valibot", "yup"]);
  const names = results.map(r => r.source.name);

  // Compare how each handles string validation
  const comparisons = {};
  for (const name of names) {
    const matches = await opensrc.grep("string.*validate|validateString", {
      sources: [name],
      include: "*.ts",
      maxResults: 10
    });
    comparisons[name] = matches.map(m => `${m.file}:${m.line}`);
  }
  return comparisons;
}
```

## Search Patterns

### Grep → Read Context

```javascript
async () => {
  const matches = await opensrc.grep("export function parse\\(", {
    sources: ["zod"],
    include: "*.ts"
  });

  if (matches.length === 0) return "No matches";

  const match = matches[0];
  const content = await opensrc.read(match.source, match.file);
  const lines = content.split("\n");

  // Return 40 lines starting from the match
  return {
    file: match.file,
    code: lines.slice(match.line - 1, match.line + 39).join("\n")
  };
}
```

### Search Across All Fetched Sources

```javascript
async () => {
  const sources = opensrc.list();
  const results = {};

  for (const source of sources) {
    const errorHandling = await opensrc.grep("throw new|catch \\(|\\.catch\\(", {
      sources: [source.name],
      include: "*.ts",
      maxResults: 20
    });
    results[source.name] = {
      type: source.type,
      errorPatterns: errorHandling.length
    };
  }

  return results;
}
```

## AST-Based Search

Use `astGrep` for semantic code search with pattern matching.

### Find Function Declarations

```javascript
async () => {
  const [{ source }] = await opensrc.fetch("lodash");

  const fns = await opensrc.astGrep(source.name, "function $NAME($$$ARGS) { $$$BODY }", {
    lang: "js",
    limit: 20
  });

  return fns.map(m => ({
    file: m.file,
    line: m.line,
    name: m.metavars.NAME
  }));
}
```

### Find React Hooks Usage

```javascript
async () => {
  const [{ source }] = await opensrc.fetch("vercel/ai");

  const stateHooks = await opensrc.astGrep(
    source.name,
    "const [$STATE, $SETTER] = useState($$$INIT)",
    { lang: ["ts", "tsx"], limit: 50 }
  );

  return stateHooks.map(m => ({
    file: m.file,
    state: m.metavars.STATE,
    setter: m.metavars.SETTER
  }));
}
```

### Find Class Definitions with Context

```javascript
async () => {
  const [{ source }] = await opensrc.fetch("zod");

  const classes = await opensrc.astGrep(source.name, "class $NAME", {
    glob: "**/*.ts"
  });

  const details = [];
  for (const cls of classes.slice(0, 5)) {
    const content = await opensrc.read(source.name, cls.file);
    const lines = content.split("\n");
    details.push({
      name: cls.metavars.NAME,
      file: cls.file,
      preview: lines.slice(cls.line - 1, cls.line + 9).join("\n")
    });
  }
  return details;
}
```

### Compare Export Patterns Across Libraries

```javascript
async () => {
  const results = await opensrc.fetch(["zod", "valibot"]);
  const names = results.map(r => r.source.name);

  const exports = {};
  for (const name of names) {
    const matches = await opensrc.astGrep(name, "export const $NAME = $_", {
      lang: "ts",
      limit: 30
    });
    exports[name] = matches.map(m => m.metavars.NAME);
  }
  return exports;
}
```

### grep vs astGrep

| Use Case | Tool |
|----------|------|
| Text/regex pattern | `grep` |
| Function declarations | `astGrep`: `function $NAME($$$) { $$$ }` |
| Arrow functions | `astGrep`: `const $N = ($$$) => $_` |
| Class definitions | `astGrep`: `class $NAME extends $PARENT` |
| Import statements | `astGrep`: `import { $$$IMPORTS } from "$MOD"` |
| JSX components | `astGrep`: `<$COMP $$$PROPS />` |

## Repository Exploration

### Find Entry Points

```javascript
async () => {
  const name = "github.com/vercel/ai";

  const allFiles = await opensrc.files(name, "**/*.{ts,js}");
  const entryPoints = allFiles.filter(f =>
    f.path.match(/^(src\/)?(index|main|mod)\.(ts|js)$/) ||
    f.path.includes("/index.ts")
  );

  // Read all entry points
  const contents = {};
  for (const ep of entryPoints.slice(0, 5)) {
    contents[ep.path] = await opensrc.read(name, ep.path);
  }

  return {
    totalFiles: allFiles.length,
    entryPoints: entryPoints.map(f => f.path),
    contents
  };
}
```

### Explore Package Structure

```javascript
async () => {
  const name = "zod";

  // Get all TypeScript files
  const tsFiles = await opensrc.files(name, "**/*.ts");

  // Group by directory
  const byDir = {};
  for (const f of tsFiles) {
    const dir = f.path.split("/").slice(0, -1).join("/") || ".";
    byDir[dir] = (byDir[dir] || 0) + 1;
  }

  // Read key files
  const pkg = await opensrc.read(name, "package.json");
  const readme = await opensrc.read(name, "README.md");

  return {
    structure: byDir,
    package: JSON.parse(pkg),
    readmePreview: readme.slice(0, 500)
  };
}
```

## Batch Operations

### Read Many with Error Handling

```javascript
async () => {
  const files = await opensrc.readMany("zod", [
    "src/index.ts",
    "src/types.ts",
    "src/ZodError.ts",
    "src/helpers/parseUtil.ts"
  ]);

  // files is Record<string, string> - errors start with "[Error:"
  const successful = Object.entries(files)
    .filter(([_, content]) => !content.startsWith("[Error:"))
    .map(([path, content]) => ({ path, lines: content.split("\n").length }));

  return successful;
}
```
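When both halves are needed, the same `[Error:` convention supports a small partition helper. A sketch only (`partitionReads` is an illustrative name, not part of the opensrc API):

```javascript
// Split a readMany result into successful reads and failures, relying on
// the documented convention that failed reads come back as "[Error: ...]".
function partitionReads(files) {
  const ok = {};
  const failed = {};
  for (const [path, content] of Object.entries(files)) {
    if (content.startsWith("[Error:")) failed[path] = content;
    else ok[path] = content;
  }
  return { ok, failed };
}
```

This lets a workflow report missing files explicitly instead of silently dropping them.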
### Parallel Grep Across Multiple Sources

```javascript
async () => {
  const targets = ["zod", "valibot"];
  const pattern = "export (type|interface)";

  const results = await Promise.all(
    targets.map(async (name) => {
      const matches = await opensrc.grep(pattern, {
        sources: [name],
        include: "*.ts",
        maxResults: 50
      });
      return { name, count: matches.length, matches };
    })
  );

  return results;
}
```

## Workflow Checklist

### Comprehensive Repository Analysis

```
Repository Analysis Progress:
- [ ] 1. Fetch repository
- [ ] 2. Read package.json + README
- [ ] 3. Identify entry points (src/index.*)
- [ ] 4. Read main entry file
- [ ] 5. Map exports and public API
- [ ] 6. Trace key functionality
- [ ] 7. Create architecture diagram
```

### Library Comparison

```
Comparison Progress:
- [ ] 1. Fetch all libraries
- [ ] 2. Grep for target pattern in each
- [ ] 3. Read matching implementations
- [ ] 4. Create comparison table
- [ ] 5. Synthesize findings
```
profiles/opencode/skill/librarian/references/tool-routing.md
@@ -0,0 +1,109 @@
# Tool Routing

## Decision Flowchart

```mermaid
graph TD
    Q[User Query] --> T{Query Type?}
    T -->|Understand/Explain| U[UNDERSTAND]
    T -->|Find/Search| F[FIND]
    T -->|Explore/Architecture| E[EXPLORE]
    T -->|Compare| C[COMPARE]

    U --> U1{Known library?}
    U1 -->|Yes| U2[context7.resolve-library-id]
    U2 --> U3[context7.query-docs]
    U3 --> U4{Need source?}
    U4 -->|Yes| U5[opensrc.fetch → read]
    U1 -->|No| U6[grep_app → opensrc.fetch]

    F --> F1{Specific repo?}
    F1 -->|Yes| F2[opensrc.fetch → grep → read]
    F1 -->|No| F3[grep_app broad search]
    F3 --> F4[opensrc.fetch interesting repos]

    E --> E1[opensrc.fetch]
    E1 --> E2[opensrc.files]
    E2 --> E3[Read entry points]
    E3 --> E4[Create diagram]

    C --> C1["opensrc.fetch([X, Y])"]
    C1 --> C2[grep same pattern]
    C2 --> C3[Read comparable files]
    C3 --> C4[Synthesize comparison]
```

## Query Type Detection

| Keywords | Query Type | Start With |
|----------|------------|------------|
| "how does", "why does", "explain", "purpose of" | UNDERSTAND | context7 |
| "find", "where is", "implementations of", "examples of" | FIND | grep_app |
| "explore", "walk through", "architecture", "structure" | EXPLORE | opensrc |
| "compare", "vs", "difference between" | COMPARE | opensrc |
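
The routing table above can be sketched as a small keyword matcher. This is only an illustration built from the example phrases in the table; the real agent applies judgment, and the `ROUTES` structure and `classifyQuery` helper are assumptions, not part of any tool's API:

```javascript
// Map the table's example keywords to a query type; first match wins.
const ROUTES = [
  { type: "UNDERSTAND", startWith: "context7", phrases: ["how does", "why does", "explain", "purpose of"] },
  { type: "FIND",       startWith: "grep_app", phrases: ["find", "where is", "implementations of", "examples of"] },
  { type: "EXPLORE",    startWith: "opensrc",  phrases: ["explore", "walk through", "architecture", "structure"] },
  { type: "COMPARE",    startWith: "opensrc",  phrases: ["compare", " vs ", "difference between"] },
];

function classifyQuery(query) {
  const q = query.toLowerCase();
  for (const route of ROUTES) {
    if (route.phrases.some((p) => q.includes(p))) return route;
  }
  return ROUTES[0]; // default to UNDERSTAND when nothing matches
}
```

Ambiguous queries ("find out how X works") simply take the first matching route, which is why UNDERSTAND is listed first.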

## UNDERSTAND Queries

```
Known library? → context7.resolve-library-id → context7.query-docs
                   └─ Need source? → opensrc.fetch → read

Unknown? → grep_app search → opensrc.fetch top result → read
```

**When to transition context7 → opensrc:**
- Need implementation details (not just API docs)
- Questions about internals/private methods
- Tracing code flow through the library

## FIND Queries

```
Specific repo? → opensrc.fetch → opensrc.grep → read matches

Broad search? → grep_app → analyze → opensrc.fetch interesting repos
```

**grep_app query tips:**
- Use literal code patterns: `useState(` not "react hooks"
- Filter by language: `language: ["TypeScript"]`
- Narrow by repo: `repo: "vercel/"` for an org

## EXPLORE Queries

```
1. opensrc.fetch(target)
2. opensrc.files → understand structure
3. Identify entry points: README, package.json, src/index.*
4. Read entry → internals
5. Create architecture diagram
```

## COMPARE Queries

```
1. opensrc.fetch([X, Y])
2. Extract source.name from each result
3. opensrc.grep same pattern in both
4. Read comparable files
5. Synthesize → comparison table
```

## Tool Capabilities

| Tool | Best For | Not For |
|------|----------|---------|
| **grep_app** | Broad search, unknown scope, finding repos | Semantic queries |
| **context7** | Library APIs, best practices, common patterns | Library internals |
| **opensrc** | Deep exploration, reading internals, tracing flow | Initial discovery |

## Anti-patterns

| Don't | Do |
|-------|-----|
| grep_app for known library docs | context7 first |
| opensrc.fetch before knowing target | grep_app to discover |
| Multiple small reads | opensrc.readMany batch |
| Describe without linking | Link every file ref |
| Text for complex relationships | Mermaid diagram |
| Tool names in responses | "I'll search..." not "I'll use opensrc" |

**profiles/opencode/skill/overseer-plan/SKILL.md** (new file, 110 lines)

---
name: overseer-plan
description: Convert markdown planning documents to Overseer tasks via MCP codemode. Use when converting plans, specs, or design docs to trackable task hierarchies.
license: MIT
metadata:
  author: dmmulroy
  version: "1.0.0"
---

# Converting Markdown Documents to Overseer Tasks

Use `/overseer-plan` to convert any markdown planning document into trackable Overseer tasks.

## When to Use

- After completing a plan in plan mode
- Converting specs/design docs to implementation tasks
- Creating tasks from roadmap or milestone documents

## Usage

```
/overseer-plan <markdown-file-path>
/overseer-plan <file> --priority 3        # Set priority (1-5)
/overseer-plan <file> --parent <task-id>  # Create as child of an existing task
```

## What It Does

1. Reads the markdown file
2. Extracts the title from the first `#` heading (strips any "Plan: " prefix)
3. Creates an Overseer milestone (or a child task if `--parent` is provided)
4. Analyzes the structure for child task breakdown
5. Creates child tasks (depth 1) or subtasks (depth 2) when appropriate
6. Returns the task ID and a breakdown summary

## Hierarchy Levels

| Depth | Name | Example |
|-------|------|---------|
| 0 | **Milestone** | "Add user authentication system" |
| 1 | **Task** | "Implement JWT middleware" |
| 2 | **Subtask** | "Add token verification function" |
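
As a quick illustration, depth maps one-to-one onto the `TaskType` alias used by the API's `type` filter. The helper names here are illustrative, not part of the Overseer API:

```javascript
// Depth 0/1/2 corresponds to the TaskType alias "milestone" | "task" | "subtask".
const TASK_TYPES = ["milestone", "task", "subtask"];

const typeForDepth = (depth) => TASK_TYPES[depth];
const depthForType = (type) => TASK_TYPES.indexOf(type);
```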

## Breakdown Decision

**Create subtasks when:**
- 3-7 clearly separable work items
- Implementation spans multiple files/components
- Clear sequential dependencies

**Keep a single milestone when:**
- Only 1-2 steps
- Work items are tightly coupled
- The plan is exploratory/investigative
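
A hedged sketch of this decision as a predicate. The 3-7 threshold and the exploratory/coupling checks come straight from the lists above; the function name and parameter shape are illustrative, and a real breakdown also involves judgment these booleans cannot capture:

```javascript
// Decide whether a plan warrants subtasks, per the criteria above.
// itemCount: clearly separable work items; the flags mirror the "keep single" list.
function shouldCreateSubtasks({ itemCount, exploratory = false, tightlyCoupled = false }) {
  if (exploratory || tightlyCoupled) return false;
  return itemCount >= 3 && itemCount <= 7;
}
```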

## Task Quality Criteria

Every task must be:
- **Atomic**: a single committable unit of work
- **Validated**: has tests OR explicit acceptance criteria in its context ("Done when: ...")
- **Clear**: technical, specific, imperative verb

Every milestone must be:
- **Demoable**: produces a runnable/testable increment
- **Built on prior work**: may depend on the previous milestone's output

## Review Workflow

1. Analyze document -> propose breakdown
2. **Invoke Oracle** to review the breakdown and suggest improvements
3. Incorporate feedback
4. Create in Overseer (persists to SQLite via MCP)

## After Creating

```javascript
await tasks.get("<id>");                 // TaskWithContext (full context + learnings)
await tasks.list({ parentId: "<id>" });  // Task[] (children without context chain)
await tasks.start("<id>");               // Task (VCS required - creates bookmark, records start commit)
await tasks.complete("<id>", { result: "...", learnings: [...] }); // Task (VCS required - commits, bubbles learnings)
```

**VCS required**: `start` and `complete` require jj or git (they fail with `NotARepository` if neither is found). CRUD operations work without VCS.

**Note**: Priority must be 1-5. Blockers cannot be ancestors or descendants.

## When NOT to Use

- Document is incomplete or exploratory
- Content is not actionable
- No meaningful planning content

---

## Reading Order

| Task | File |
|------|------|
| Understanding the API | @file references/api.md |
| Agent implementation | @file references/implementation.md |
| See examples | @file references/examples.md |

## In This Reference

| File | Purpose |
|------|---------|
| `references/api.md` | Overseer MCP codemode API types/methods |
| `references/implementation.md` | Step-by-step execution instructions for the agent |
| `references/examples.md` | Complete worked examples |

**profiles/opencode/skill/overseer-plan/references/api.md** (new file, 192 lines)

# Overseer Codemode MCP API

Execute JavaScript code to interact with Overseer task management.

## Task Interfaces

```typescript
// Basic task - returned by list(), create(), start(), complete()
// Note: does NOT include context or learnings fields
interface Task {
  id: string;
  parentId: string | null;
  description: string;
  priority: 1 | 2 | 3 | 4 | 5;
  completed: boolean;
  completedAt: string | null;
  startedAt: string | null;
  createdAt: string;            // ISO 8601
  updatedAt: string;
  result: string | null;        // Completion notes
  commitSha: string | null;     // Auto-populated on complete
  depth: 0 | 1 | 2;             // 0=milestone, 1=task, 2=subtask
  blockedBy?: string[];         // Blocking task IDs (omitted if empty)
  blocks?: string[];            // Tasks this blocks (omitted if empty)
  bookmark?: string;            // VCS bookmark name (if started)
  startCommit?: string;         // Commit SHA at start
  effectivelyBlocked: boolean;  // True if task OR an ancestor has incomplete blockers
}

// Task with full context - returned by get(), nextReady()
interface TaskWithContext extends Task {
  context: {
    own: string;        // This task's context
    parent?: string;    // Parent's context (depth > 0)
    milestone?: string; // Root milestone's context (depth > 1)
  };
  learnings: {
    own: Learning[];       // This task's learnings (bubbled from completed children)
    parent: Learning[];    // Parent's learnings (depth > 0)
    milestone: Learning[]; // Milestone's learnings (depth > 1)
  };
}

// Task tree structure - returned by tree()
interface TaskTree {
  task: Task;
  children: TaskTree[];
}

// Progress summary - returned by progress()
interface TaskProgress {
  total: number;
  completed: number;
  ready: number;   // !completed && !effectivelyBlocked
  blocked: number; // !completed && effectivelyBlocked
}

// Task type alias for the depth filter
type TaskType = "milestone" | "task" | "subtask";
```
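
For intuition, the `TaskProgress` counts can be derived from a `Task[]` using only the `completed` and `effectivelyBlocked` fields. This is a sketch of the aggregation, not the server's implementation:

```javascript
// Aggregate a Task[] into the TaskProgress shape defined above.
function computeProgress(taskList) {
  const progress = { total: taskList.length, completed: 0, ready: 0, blocked: 0 };
  for (const t of taskList) {
    if (t.completed) progress.completed++;
    else if (t.effectivelyBlocked) progress.blocked++;
    else progress.ready++;
  }
  return progress;
}
```

Note that `ready + blocked + completed === total` always holds, since the three branches are exhaustive and mutually exclusive.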

## Learning Interface

```typescript
interface Learning {
  id: string;
  taskId: string;
  content: string;
  sourceTaskId: string | null;
  createdAt: string;
}
```

## Tasks API

```typescript
declare const tasks: {
  list(filter?: {
    parentId?: string;
    ready?: boolean;
    completed?: boolean;
    depth?: 0 | 1 | 2;  // 0=milestones, 1=tasks, 2=subtasks
    type?: TaskType;    // Alias: "milestone"|"task"|"subtask" (mutually exclusive with depth)
  }): Promise<Task[]>;
  get(id: string): Promise<TaskWithContext>;
  create(input: {
    description: string;
    context?: string;
    parentId?: string;
    priority?: 1 | 2 | 3 | 4 | 5; // Must be 1-5
    blockedBy?: string[];         // Cannot be ancestors/descendants
  }): Promise<Task>;
  update(id: string, input: {
    description?: string;
    context?: string;
    priority?: 1 | 2 | 3 | 4 | 5;
    parentId?: string;
  }): Promise<Task>;
  start(id: string): Promise<Task>;
  complete(id: string, input?: { result?: string; learnings?: string[] }): Promise<Task>;
  reopen(id: string): Promise<Task>;
  delete(id: string): Promise<void>;
  block(taskId: string, blockerId: string): Promise<void>;
  unblock(taskId: string, blockerId: string): Promise<void>;
  nextReady(milestoneId?: string): Promise<TaskWithContext | null>;
  tree(rootId?: string): Promise<TaskTree | TaskTree[]>;
  search(query: string): Promise<Task[]>;
  progress(rootId?: string): Promise<TaskProgress>;
};
```

| Method | Returns | Description |
|--------|---------|-------------|
| `list` | `Task[]` | Filter by `parentId`, `ready`, `completed`, `depth`, `type` |
| `get` | `TaskWithContext` | Get task with full context chain + inherited learnings |
| `create` | `Task` | Create task (priority must be 1-5) |
| `update` | `Task` | Update description, context, priority, parentId |
| `start` | `Task` | **VCS required** - creates bookmark, records start commit |
| `complete` | `Task` | **VCS required** - commits changes + bubbles learnings to parent |
| `reopen` | `Task` | Reopen a completed task |
| `delete` | `void` | Delete task + best-effort VCS bookmark cleanup |
| `block` | `void` | Add blocker (cannot be self, an ancestor, or a descendant) |
| `unblock` | `void` | Remove blocker relationship |
| `nextReady` | `TaskWithContext \| null` | Get deepest ready leaf with full context |
| `tree` | `TaskTree \| TaskTree[]` | Get task tree (all milestones if no ID) |
| `search` | `Task[]` | Search by description/context/result (case-insensitive) |
| `progress` | `TaskProgress` | Aggregate counts for a milestone or all tasks |

## Learnings API

Learnings are added via `tasks.complete(id, { learnings: [...] })` and bubble up to the immediate parent (preserving `sourceTaskId`).

```typescript
declare const learnings: {
  list(taskId: string): Promise<Learning[]>;
};
```

| Method | Description |
|--------|-------------|
| `list` | List learnings for a task |

## VCS Integration (Required for Workflow)

VCS operations are **handled automatically** by the tasks API:

| Task Operation | VCS Effect |
|----------------|------------|
| `tasks.start(id)` | **VCS required** - creates bookmark `task/<id>`, records start commit |
| `tasks.complete(id)` | **VCS required** - commits changes (NothingToCommit = success) |
| `tasks.delete(id)` | Best-effort bookmark cleanup (logs a warning on failure) |

**VCS (jj or git) is required** for start/complete; they fail with `NotARepository` if neither is found. CRUD operations work without VCS.

## Quick Examples

```javascript
// Create milestone with subtask
const milestone = await tasks.create({
  description: "Build authentication system",
  context: "JWT-based auth with refresh tokens",
  priority: 1
});

const subtask = await tasks.create({
  description: "Implement token refresh logic",
  parentId: milestone.id,
  context: "Handle 7-day expiry"
});

// Start work (VCS required - creates bookmark)
await tasks.start(subtask.id);

// ... do implementation work ...

// Complete with learnings (VCS required - commits changes, bubbles learnings to parent)
await tasks.complete(subtask.id, {
  result: "Implemented using jose library",
  learnings: ["Use jose instead of jsonwebtoken"]
});

// Get progress summary
const progress = await tasks.progress(milestone.id);
// -> { total: 2, completed: 1, ready: 1, blocked: 0 }

// Search tasks
const authTasks = await tasks.search("authentication");

// Get task tree
const tree = await tasks.tree(milestone.id);
// -> { task: Task, children: TaskTree[] }
```

**profiles/opencode/skill/overseer-plan/references/examples.md** (new file, 177 lines)

# Examples

## Example 1: With Breakdown

### Input (`auth-plan.md`)

```markdown
# Plan: Add Authentication System

## Implementation
1. Create database schema for users/tokens
2. Implement auth controller with endpoints
3. Add JWT middleware for route protection
4. Build frontend login/register forms
5. Add integration tests
```

### Execution

```javascript
const milestone = await tasks.create({
  description: "Add Authentication System",
  context: `# Add Authentication System\n\n## Implementation\n1. Create database schema...`,
  priority: 3
});

const subtasks = [
  { desc: "Create database schema for users/tokens", done: "Migration runs, tables exist with FK constraints" },
  { desc: "Implement auth controller with endpoints", done: "POST /register, /login return expected responses" },
  { desc: "Add JWT middleware for route protection", done: "Unauthorized requests return 401, valid tokens pass" },
  { desc: "Build frontend login/register forms", done: "Forms render, submit without errors" },
  { desc: "Add integration tests", done: "`npm test` passes with auth coverage" }
];

for (const sub of subtasks) {
  await tasks.create({
    description: sub.desc,
    context: `Part of 'Add Authentication System'.\n\nDone when: ${sub.done}`,
    parentId: milestone.id
  });
}

return { milestone: milestone.id, subtaskCount: subtasks.length };
```

### Output

```
Created milestone task_01ABC from plan

Analyzed plan structure: Found 5 distinct implementation steps
Created 5 subtasks:
- task_02XYZ: Create database schema for users/tokens
- task_03ABC: Implement auth controller with endpoints
- task_04DEF: Add JWT middleware for route protection
- task_05GHI: Build frontend login/register forms
- task_06JKL: Add integration tests

View structure: execute `await tasks.list({ parentId: "task_01ABC" })`
```

## Example 2: No Breakdown

### Input (`bugfix-plan.md`)

```markdown
# Plan: Fix Login Validation Bug

## Problem
Login fails when the username has spaces

## Solution
Update the validation regex in auth.ts line 42
```

### Execution

```javascript
const milestone = await tasks.create({
  description: "Fix Login Validation Bug",
  context: `# Fix Login Validation Bug\n\n## Problem\nLogin fails...`,
  priority: 3
});

return { milestone: milestone.id, breakdown: false };
```

### Output

```
Created milestone task_01ABC from plan

Plan describes a cohesive single task. No subtask breakdown needed.

View task: execute `await tasks.get("task_01ABC")`
```

## Example 3: Epic-Level (Two-Level Hierarchy)

### Input (`full-auth-plan.md`)

```markdown
# Complete User Authentication System

## Phase 1: Backend Infrastructure
1. Database schema for users/sessions
2. Password hashing with bcrypt
3. JWT token generation

## Phase 2: API Endpoints
1. POST /auth/register
2. POST /auth/login
3. POST /auth/logout

## Phase 3: Frontend
1. Login/register forms
2. Protected routes
3. Session persistence
```

### Execution

```javascript
const milestone = await tasks.create({
  description: "Complete User Authentication System",
  context: `<full-markdown>`,
  priority: 3
});

const phases = [
  { name: "Backend Infrastructure", items: [
    { desc: "Database schema", done: "Migration runs, tables exist" },
    { desc: "Password hashing", done: "bcrypt hashes verified in tests" },
    { desc: "JWT tokens", done: "Token generation/validation works" }
  ]},
  { name: "API Endpoints", items: [
    { desc: "POST /auth/register", done: "Creates user, returns 201" },
    { desc: "POST /auth/login", done: "Returns JWT on valid credentials" },
    { desc: "POST /auth/logout", done: "Invalidates session, returns 200" }
  ]},
  { name: "Frontend", items: [
    { desc: "Login/register forms", done: "Forms render, submit successfully" },
    { desc: "Protected routes", done: "Redirect to login when unauthenticated" },
    { desc: "Session persistence", done: "Refresh maintains logged-in state" }
  ]}
];

for (const phase of phases) {
  const phaseTask = await tasks.create({
    description: phase.name,
    parentId: milestone.id
  });
  for (const item of phase.items) {
    await tasks.create({
      description: item.desc,
      context: `Part of '${phase.name}'.\n\nDone when: ${item.done}`,
      parentId: phaseTask.id
    });
  }
}

return milestone;
```

### Output

```
Created milestone task_01ABC from plan

Analyzed plan structure: Found 3 major phases
Created as milestone with 3 tasks:
- task_02XYZ: Backend Infrastructure (3 subtasks)
- task_03ABC: API Endpoints (3 subtasks)
- task_04DEF: Frontend (3 subtasks)

View structure: execute `await tasks.list({ parentId: "task_01ABC" })`
```

**profiles/opencode/skill/overseer-plan/references/implementation.md** (new file, 210 lines)

# Implementation Instructions

**For the skill agent executing `/overseer-plan`.** Follow this workflow exactly.

## Step 1: Read the Markdown File

Read the provided file using the Read tool.

## Step 2: Extract the Title

- Parse the first `#` heading as the title
- Strip a "Plan: " prefix if present (case-insensitive)
- Fallback: use the filename without its extension
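
The extraction rules above can be sketched in plain JavaScript. The helper name and the filename argument are illustrative, not part of the codemode API:

```javascript
// Extract a milestone title: first "#" heading, minus any case-insensitive
// "Plan: " prefix, falling back to the filename without its extension.
function extractTitle(markdown, filename) {
  const heading = markdown.match(/^#\s+(.+)$/m); // matches "# ..." but not "## ..."
  if (heading) {
    return heading[1].replace(/^plan:\s*/i, "").trim();
  }
  return filename.replace(/\.[^.]+$/, "");
}
```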

## Step 3: Create the Milestone via MCP

Basic creation:

```javascript
const milestone = await tasks.create({
  description: "<extracted-title>",
  context: `<full-markdown-content>`,
  priority: <priority-if-provided-else-3>
});
return milestone;
```

With the `--parent` option:

```javascript
const task = await tasks.create({
  description: "<extracted-title>",
  context: `<full-markdown-content>`,
  parentId: "<parent-id>",
  priority: <priority-if-provided-else-3>
});
return task;
```

Capture the returned task ID for subsequent steps.

## Step 4: Analyze the Plan Structure

### Breakdown Indicators

1. **Numbered/bulleted implementation lists (3-7 items)**
   ```markdown
   ## Implementation
   1. Create database schema
   2. Build API endpoints
   3. Add frontend components
   ```

2. **Clear subsections under implementation/tasks/steps**
   ```markdown
   ### 1. Backend Changes
   - Modify server.ts

   ### 2. Frontend Updates
   - Update login form
   ```

3. **File-specific sections**
   ```markdown
   ### `src/auth.ts` - Add JWT validation
   ### `src/middleware.ts` - Create auth middleware
   ```

4. **Sequential phases**
   ```markdown
   **Phase 1: Database Layer**
   **Phase 2: API Layer**
   ```

### Do NOT Break Down When

- Only 1-2 steps/items
- The plan is a single cohesive fix
- Content is exploratory ("investigate", "research")
- Work items are inseparable
- The plan is very short (<10 lines)
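
A hedged sketch of the first indicator combined with the do-not-break-down checks. The regexes cover only the simple cases shown above; detecting subsections, file-specific sections, and phases requires more context, and the function name is illustrative:

```javascript
// Count numbered list items and flag exploratory language, per Step 4.
function analyzePlan(markdown) {
  const numberedItems = markdown.match(/^\s*\d+\.\s+.+$/gm) ?? [];
  const exploratory = /\b(investigate|research)\b/i.test(markdown);
  const lineCount = markdown.split("\n").length;
  const breakdown =
    !exploratory &&
    lineCount >= 10 &&                 // skip very short plans
    numberedItems.length >= 3 &&
    numberedItems.length <= 7;
  return { items: numberedItems.length, exploratory, breakdown };
}
```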

## Step 5: Validate Atomicity & Acceptance Criteria

For each proposed task, verify:
- **Atomic**: can be completed in a single commit
- **Validated**: has clear acceptance criteria

If a task is too large -> split it further.
If it has no validation -> add to its context:

```
Done when: <specific observable criteria>
```

Examples of good acceptance criteria:
- "Done when: `npm test` passes, new migration applied"
- "Done when: API returns 200 with expected payload"
- "Done when: Component renders without console errors"
- "Done when: Type check passes (`tsc --noEmit`)"
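
The validation gate can be sketched as a small guard. These helpers are illustrative; they only catch a missing "Done when:" marker, while atomicity remains a judgment call:

```javascript
// Flag proposed tasks whose context lacks explicit acceptance criteria.
const hasAcceptanceCriteria = (context = "") => /Done when:/i.test(context);

const needsCriteria = (proposedTasks) =>
  proposedTasks.filter((t) => !hasAcceptanceCriteria(t.context));
```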

## Step 6: Oracle Review

Before creating tasks, invoke Oracle to review the proposed breakdown.

**Prompt Oracle with:**

```
Review this task breakdown for "<milestone>":

1. <task> - Done when: <criteria>
2. <task> - Done when: <criteria>
...

Check:
- Are tasks truly atomic (single commit)?
- Are the validation criteria clear and observable?
- Does the milestone deliver a demoable increment?
- Missing dependencies/blockers?
- Any tasks that should be split or merged?
```

Incorporate Oracle's feedback, then proceed to create the tasks.

## Step 7: Create Subtasks (If Breaking Down)

### Extract for Each Subtask

1. **Description**: strip numbering, keep it concise (1-10 words), imperative form
2. **Context**: section content + "Part of [milestone description]" + acceptance criteria

### Flat Breakdown

```javascript
const subtasks = [
  { description: "Create database schema", context: "Schema for users/tokens. Part of 'Add Auth'.\n\nDone when: Migration runs, tables exist with FK constraints." },
  { description: "Build API endpoints", context: "POST /auth/register, /auth/login. Part of 'Add Auth'.\n\nDone when: Endpoints return expected responses, tests pass." }
];

const created = [];
for (const sub of subtasks) {
  const task = await tasks.create({
    description: sub.description,
    context: sub.context,
    parentId: milestone.id
  });
  created.push(task);
}
return { milestone: milestone.id, subtasks: created };
```

### Epic-Level Breakdown (phases with sub-items)

```javascript
// Create each phase as a task under the milestone
const phase = await tasks.create({
  description: "Backend Infrastructure",
  context: "Phase 1 context...",
  parentId: milestoneId
});

// Create subtasks under the phase
for (const item of phaseItems) {
  await tasks.create({
    description: item.description,
    context: item.context,
    parentId: phase.id
  });
}
```

## Step 8: Report Results

### Subtasks Created

```
Created milestone <id> from plan

Analyzed plan structure: Found <N> distinct implementation steps
Created <N> subtasks:
- <id>: <description>
- <id>: <description>
...

View structure: execute `await tasks.list({ parentId: "<id>" })`
```

### No Breakdown

```
Created milestone <id> from plan

Plan describes a cohesive single task. No subtask breakdown needed.

View task: execute `await tasks.get("<id>")`
```

### Epic-Level Breakdown

```
Created milestone <id> from plan

Analyzed plan structure: Found <N> major phases
Created as milestone with <N> tasks:
- <id>: <phase-name> (<M> subtasks)
- <id>: <phase-name> (<M> subtasks)
...

View structure: execute `await tasks.list({ parentId: "<id>" })`
```

**profiles/opencode/skill/overseer/SKILL.md** (new file, 191 lines)

---
name: overseer
description: Manage tasks via Overseer codemode MCP. Use when tracking multi-session work, breaking down implementation, or persisting context for handoffs.
license: MIT
metadata:
  author: dmmulroy
  version: "1.0.0"
---

# Agent Coordination with Overseer

## Core Principle: Tickets, Not Todos

Overseer tasks are **tickets** - structured artifacts with comprehensive context:

- **Description**: one-line summary (issue title)
- **Context**: full background, requirements, approach (issue body)
- **Result**: implementation details, decisions, outcomes (PR description)

Think: "Would someone understand the what, why, and how from this task alone AND what success looks like?"
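
For instance, a ticket-quality input (as opposed to a bare todo) might look like this. The feature and its values are hypothetical, used only to show the description/context split:

```javascript
// A ticket-quality task input, per the fields above.
const ticket = {
  description: "Add rate limiting to public API", // issue title: one line
  context: [
    "Public endpoints are currently unthrottled.",          // background
    "Approach: token bucket per API key in middleware.",    // approach
    "Done when: requests past 100 req/min receive HTTP 429." // acceptance criteria
  ].join("\n"),
  priority: 2
};
// In the codemode environment this would be passed to tasks.create(ticket).
```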

## Task IDs are Ephemeral

**Never reference task IDs in external artifacts** (commits, PRs, docs). Task IDs like `task_01JQAZ...` become meaningless once tasks complete. Describe the work itself, not the task that tracked it.

## Overseer vs OpenCode's TodoWrite

| | Overseer | TodoWrite |
| --------------- | ------------------------------------- | ---------------------- |
| **Persistence** | SQLite database | Session-only |
| **Context** | Rich (description + context + result) | Basic |
| **Hierarchy** | 3-level (milestone -> task -> subtask) | Flat |

Use **Overseer** for persistent work. Use **TodoWrite** for ephemeral in-session tracking only.

## When to Use Overseer

**Use Overseer when:**
- Breaking down complexity into subtasks
- Work spans multiple sessions
- Context needs to persist for handoffs
- Recording decisions for future reference

**Skip Overseer when:**
- Work is a single atomic action
- Everything fits in one message exchange
- Overhead exceeds value
- TodoWrite is sufficient

## Finding Work

```javascript
// Get the next ready task with full context (recommended for work sessions)
const task = await tasks.nextReady(milestoneId); // TaskWithContext | null
if (!task) {
  console.log("No ready tasks");
  return;
}

// Get all ready tasks (for a progress overview)
const readyTasks = await tasks.list({ ready: true }); // Task[]
```

**Use `nextReady()`** when starting work - it returns `TaskWithContext | null` (the deepest ready leaf with its full context chain + inherited learnings).
**Use `list({ ready: true })`** for status/progress checks - it returns `Task[]` without the context chain.

## Basic Workflow

```javascript
// 1. Get the next ready task (returns TaskWithContext | null)
const task = await tasks.nextReady();
if (!task) return "No ready tasks";

// 2. Review context (available on TaskWithContext)
console.log(task.context.own);       // This task's context
console.log(task.context.parent);    // Parent's context (if depth > 0)
console.log(task.context.milestone); // Root milestone context (if depth > 1)
console.log(task.learnings.own);     // Learnings attached to this task (bubbled from children)

// 3. Start work (VCS required - creates bookmark, records start commit)
await tasks.start(task.id);
// 4. Implement...
|
||||||
|
|
||||||
|
// 5. Complete with learnings (VCS required - commits changes, bubbles learnings to parent)
|
||||||
|
await tasks.complete(task.id, {
|
||||||
|
result: "Implemented login endpoint with JWT tokens",
|
||||||
|
learnings: ["bcrypt rounds should be 12 for production"]
|
||||||
|
});
|
||||||
|
```
|
||||||
|
|
||||||
|
See @file references/workflow.md for detailed workflow guidance.
|
||||||
|
|
||||||
|
## Understanding Task Context
|
||||||
|
|
||||||
|
Tasks have **progressive context** - inherited from ancestors:
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
const task = await tasks.get(taskId); // Returns TaskWithContext
|
||||||
|
// task.context.own - this task's context (always present)
|
||||||
|
// task.context.parent - parent task's context (if depth > 0)
|
||||||
|
// task.context.milestone - root milestone's context (if depth > 1)
|
||||||
|
|
||||||
|
// Task's own learnings (bubbled from completed children)
|
||||||
|
// task.learnings.own - learnings attached to this task
|
||||||
|
```
|
||||||
|
|
||||||
|
## Return Type Summary
|
||||||
|
|
||||||
|
| Method | Returns | Notes |
|
||||||
|
|--------|---------|-------|
|
||||||
|
| `tasks.get(id)` | `TaskWithContext` | Full context chain + inherited learnings |
|
||||||
|
| `tasks.nextReady()` | `TaskWithContext \| null` | Deepest ready leaf with full context |
|
||||||
|
| `tasks.list()` | `Task[]` | Basic task fields only |
|
||||||
|
| `tasks.create()` | `Task` | No context chain |
|
||||||
|
| `tasks.start/complete()` | `Task` | No context chain |
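The context chain is what separates the two shapes at runtime. A minimal sketch of checking which shape you have (illustrative plain JavaScript; `hasContextChain` is a hypothetical helper, not part of the Overseer API):

```javascript
// Distinguishes a TaskWithContext from a basic Task by the presence of
// the `context` field, per the Return Type Summary above.
function hasContextChain(task) {
  return typeof task === "object" && task !== null && "context" in task;
}

// Sample objects shaped like the table describes (hypothetical data).
const basic = { id: "task_A", description: "Add login endpoint" };
const withContext = {
  ...basic,
  context: { own: "JWT auth" },
  learnings: { own: [], parent: [], milestone: [] },
};

console.log(hasContextChain(basic));       // false
console.log(hasContextChain(withContext)); // true
```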

## Blockers

Blockers prevent a task from being ready until the blocker completes.

**Constraints:**

- Blockers cannot be self
- Blockers cannot be ancestors (parent, grandparent, etc.)
- Blockers cannot be descendants
- Creating/reparenting with invalid blockers is rejected

```javascript
// Add blocker - taskA waits for taskB
await tasks.block(taskA.id, taskB.id);

// Remove blocker
await tasks.unblock(taskA.id, taskB.id);
```

## Task Hierarchies

Three levels: **Milestone** (depth 0) -> **Task** (depth 1) -> **Subtask** (depth 2).

| Level | Name | Purpose | Example |
|-------|------|---------|---------|
| 0 | **Milestone** | Large initiative | "Add user authentication system" |
| 1 | **Task** | Significant work item | "Implement JWT middleware" |
| 2 | **Subtask** | Atomic step | "Add token verification function" |

**Choosing the right level:**

- Small feature (1-2 files) -> Single task
- Medium feature (3-7 steps) -> Task with subtasks
- Large initiative (5+ tasks) -> Milestone with tasks

See @file references/hierarchies.md for detailed guidance.

## Recording Results

Complete tasks **immediately after implementing AND verifying**:

- Capture decisions while fresh
- Note deviations from plan
- Document verification performed
- Create follow-up tasks for tech debt

Your result must include explicit verification evidence. See @file references/verification.md.

## Best Practices

1. **Right-size tasks**: Completable in one focused session
2. **Clear completion criteria**: Context should define "done"
3. **Don't over-decompose**: 3-7 children per parent
4. **Action-oriented descriptions**: Start with verbs ("Add", "Fix", "Update")
5. **Verify before completing**: Tests passing, manual testing done

---

## Reading Order

| Task | File |
|------|------|
| Understanding API | @file references/api.md |
| Implementation workflow | @file references/workflow.md |
| Task decomposition | @file references/hierarchies.md |
| Good/bad examples | @file references/examples.md |
| Verification checklist | @file references/verification.md |

## In This Reference

| File | Purpose |
|------|---------|
| `references/api.md` | Overseer MCP codemode API types/methods |
| `references/workflow.md` | Start->implement->complete workflow |
| `references/hierarchies.md` | Milestone/task/subtask organization |
| `references/examples.md` | Good/bad context and result examples |
| `references/verification.md` | Verification checklist and process |
192
profiles/opencode/skill/overseer/references/api.md
Normal file
@@ -0,0 +1,192 @@
# Overseer Codemode MCP API

Execute JavaScript code to interact with Overseer task management.

## Task Interface

```typescript
// Basic task - returned by list(), create(), start(), complete()
// Note: Does NOT include context or learnings fields
interface Task {
  id: string;
  parentId: string | null;
  description: string;
  priority: 1 | 2 | 3 | 4 | 5;
  completed: boolean;
  completedAt: string | null;
  startedAt: string | null;
  createdAt: string; // ISO 8601
  updatedAt: string;
  result: string | null;       // Completion notes
  commitSha: string | null;    // Auto-populated on complete
  depth: 0 | 1 | 2;            // 0=milestone, 1=task, 2=subtask
  blockedBy?: string[];        // Blocking task IDs (omitted if empty)
  blocks?: string[];           // Tasks this blocks (omitted if empty)
  bookmark?: string;           // VCS bookmark name (if started)
  startCommit?: string;        // Commit SHA at start
  effectivelyBlocked: boolean; // True if task OR ancestor has incomplete blockers
}

// Task with full context - returned by get(), nextReady()
interface TaskWithContext extends Task {
  context: {
    own: string;        // This task's context
    parent?: string;    // Parent's context (depth > 0)
    milestone?: string; // Root milestone's context (depth > 1)
  };
  learnings: {
    own: Learning[];       // This task's learnings (bubbled from completed children)
    parent: Learning[];    // Parent's learnings (depth > 0)
    milestone: Learning[]; // Milestone's learnings (depth > 1)
  };
}

// Task tree structure - returned by tree()
interface TaskTree {
  task: Task;
  children: TaskTree[];
}

// Progress summary - returned by progress()
interface TaskProgress {
  total: number;
  completed: number;
  ready: number;   // !completed && !effectivelyBlocked
  blocked: number; // !completed && effectivelyBlocked
}

// Task type alias for depth filter
type TaskType = "milestone" | "task" | "subtask";
```
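The `ready`/`blocked` definitions in `TaskProgress` can be sketched as a pure reduction over a task list (illustrative only; `summarize` is a hypothetical helper, not the server's implementation):

```javascript
// Computes a TaskProgress-shaped summary from Task-shaped objects,
// using the definitions commented above:
//   ready   = !completed && !effectivelyBlocked
//   blocked = !completed && effectivelyBlocked
function summarize(taskList) {
  return taskList.reduce(
    (acc, t) => {
      acc.total += 1;
      if (t.completed) acc.completed += 1;
      else if (t.effectivelyBlocked) acc.blocked += 1;
      else acc.ready += 1;
      return acc;
    },
    { total: 0, completed: 0, ready: 0, blocked: 0 }
  );
}

const sample = [
  { completed: true,  effectivelyBlocked: false },
  { completed: false, effectivelyBlocked: true },
  { completed: false, effectivelyBlocked: false },
];
console.log(summarize(sample)); // { total: 3, completed: 1, ready: 1, blocked: 1 }
```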

## Learning Interface

```typescript
interface Learning {
  id: string;
  taskId: string;
  content: string;
  sourceTaskId: string | null;
  createdAt: string;
}
```

## Tasks API

```typescript
declare const tasks: {
  list(filter?: {
    parentId?: string;
    ready?: boolean;
    completed?: boolean;
    depth?: 0 | 1 | 2; // 0=milestones, 1=tasks, 2=subtasks
    type?: TaskType;   // Alias: "milestone"|"task"|"subtask" (mutually exclusive with depth)
  }): Promise<Task[]>;
  get(id: string): Promise<TaskWithContext>;
  create(input: {
    description: string;
    context?: string;
    parentId?: string;
    priority?: 1 | 2 | 3 | 4 | 5; // Required range: 1-5
    blockedBy?: string[];
  }): Promise<Task>;
  update(id: string, input: {
    description?: string;
    context?: string;
    priority?: 1 | 2 | 3 | 4 | 5;
    parentId?: string;
  }): Promise<Task>;
  start(id: string): Promise<Task>;
  complete(id: string, input?: { result?: string; learnings?: string[] }): Promise<Task>;
  reopen(id: string): Promise<Task>;
  delete(id: string): Promise<void>;
  block(taskId: string, blockerId: string): Promise<void>;
  unblock(taskId: string, blockerId: string): Promise<void>;
  nextReady(milestoneId?: string): Promise<TaskWithContext | null>;
  tree(rootId?: string): Promise<TaskTree | TaskTree[]>;
  search(query: string): Promise<Task[]>;
  progress(rootId?: string): Promise<TaskProgress>;
};
```

| Method | Returns | Description |
|--------|---------|-------------|
| `list` | `Task[]` | Filter by `parentId`, `ready`, `completed`, `depth`, `type` |
| `get` | `TaskWithContext` | Get task with full context chain + inherited learnings |
| `create` | `Task` | Create task (priority must be 1-5) |
| `update` | `Task` | Update description, context, priority, parentId |
| `start` | `Task` | **VCS required** - creates bookmark, records start commit |
| `complete` | `Task` | **VCS required** - commits changes + bubbles learnings to parent |
| `reopen` | `Task` | Reopen completed task |
| `delete` | `void` | Delete task + best-effort VCS bookmark cleanup |
| `block` | `void` | Add blocker (cannot be self, ancestor, or descendant) |
| `unblock` | `void` | Remove blocker relationship |
| `nextReady` | `TaskWithContext \| null` | Get deepest ready leaf with full context |
| `tree` | `TaskTree \| TaskTree[]` | Get task tree (all milestones if no ID) |
| `search` | `Task[]` | Search by description/context/result (case-insensitive) |
| `progress` | `TaskProgress` | Aggregate counts for milestone or all tasks |

## Learnings API

Learnings are added via `tasks.complete(id, { learnings: [...] })` and bubble to the immediate parent (preserving `sourceTaskId`).

```typescript
declare const learnings: {
  list(taskId: string): Promise<Learning[]>;
};
```

| Method | Description |
|--------|-------------|
| `list` | List learnings for task |
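A self-contained sketch of the bubbling semantics described above (illustrative only; the `bubbleUp` helper is hypothetical, field names follow the `Learning` interface):

```javascript
// When a child completes, its learnings are copied onto the parent,
// with sourceTaskId preserved (or set to the child if it was the origin).
function bubbleUp(childLearnings, childId, parentId) {
  return childLearnings.map((l) => ({
    ...l,
    taskId: parentId,
    sourceTaskId: l.sourceTaskId ?? childId,
  }));
}

const bubbled = bubbleUp(
  [{ id: "lrn_1", taskId: "task_child", content: "Use jose", sourceTaskId: null, createdAt: "2024-01-01T00:00:00Z" }],
  "task_child",
  "task_parent"
);
console.log(bubbled[0].taskId);       // "task_parent"
console.log(bubbled[0].sourceTaskId); // "task_child"
```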

## VCS Integration (Required for Workflow)

VCS operations are **automatically handled** by the tasks API:

| Task Operation | VCS Effect |
|----------------|------------|
| `tasks.start(id)` | **VCS required** - creates bookmark `task/<id>`, records start commit |
| `tasks.complete(id)` | **VCS required** - commits changes (NothingToCommit = success) |
| `tasks.delete(id)` | Best-effort bookmark cleanup (logs warning on failure) |

**VCS (jj or git) is required** for start/complete. Fails with `NotARepository` if none found. CRUD operations work without VCS.
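One way to handle the `NotARepository` failure is to branch on the error message. A sketch (illustrative: `tasksStub` below stands in for the real `tasks` API so the example is self-contained; in the codemode environment you would call the global `tasks` directly, and the exact error shape may differ):

```javascript
// Hypothetical stand-in that fails the way start/complete can outside a repo.
const tasksStub = {
  async start(id) {
    throw new Error(`NotARepository: cannot start ${id} without jj or git`);
  },
};

async function startSafely(api, id) {
  try {
    await api.start(id);
    return "started";
  } catch (err) {
    if (String(err.message).includes("NotARepository")) {
      // CRUD still works without VCS; only start/complete need a repo.
      return "no VCS - initialize a repo before starting tasks";
    }
    throw err;
  }
}

startSafely(tasksStub, "task_01ABC").then(console.log);
// -> "no VCS - initialize a repo before starting tasks"
```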

## Quick Examples

```javascript
// Create milestone with subtask
const milestone = await tasks.create({
  description: "Build authentication system",
  context: "JWT-based auth with refresh tokens",
  priority: 1
});

const subtask = await tasks.create({
  description: "Implement token refresh logic",
  parentId: milestone.id,
  context: "Handle 7-day expiry"
});

// Start work (auto-creates VCS bookmark)
await tasks.start(subtask.id);

// ... do implementation work ...

// Complete task with learnings (VCS required - commits changes, bubbles learnings to parent)
await tasks.complete(subtask.id, {
  result: "Implemented using jose library",
  learnings: ["Use jose instead of jsonwebtoken"]
});

// Get progress summary
const progress = await tasks.progress(milestone.id);
// -> { total: 2, completed: 1, ready: 1, blocked: 0 }

// Search tasks
const authTasks = await tasks.search("authentication");

// Get task tree
const tree = await tasks.tree(milestone.id);
// -> { task: Task, children: TaskTree[] }
```

195
profiles/opencode/skill/overseer/references/examples.md
Normal file
@@ -0,0 +1,195 @@
# Examples

Good and bad examples for writing task context and results.

## Writing Context

Context should include everything needed to do the work without asking questions:

- **What** needs to be done and why
- **Implementation approach** (steps, files to modify, technical choices)
- **Done when** (acceptance criteria)

### Good Context Example

```javascript
await tasks.create({
  description: "Migrate storage to one file per task",
  context: `Change storage format for git-friendliness:

Structure:
.overseer/
└── tasks/
    ├── task_01ABC.json
    └── task_02DEF.json

NO INDEX - just scan task files. For typical task counts (<100), this is fast.

Implementation:
1. Update storage.ts:
   - read(): Scan .overseer/tasks/*.json, parse each, return TaskStore
   - write(task): Write single task to .overseer/tasks/{id}.json
   - delete(id): Remove .overseer/tasks/{id}.json
   - Add readTask(id) for single task lookup

2. Task file format: Same as current Task schema (one task per file)

3. Migration: On read, if old tasks.json exists, migrate to new format

4. Update tests

Benefits:
- Create = new file (never conflicts)
- Update = single file change
- Delete = remove file
- No index to maintain or conflict
- git diff shows exactly which tasks changed`
});
```

**Why it works:** States the goal, shows the structure, lists specific implementation steps, explains benefits. Someone could pick this up without asking questions.

### Bad Context Example

```javascript
await tasks.create({
  description: "Add auth",
  context: "Need to add authentication"
});
```

**What's missing:** How to implement it, what files, what's done when, technical approach.

## Writing Results

Results should capture what was actually done:

- **What changed** (implementation summary)
- **Key decisions** (and why)
- **Verification** (tests passing, manual testing done)

### Good Result Example

```javascript
await tasks.complete(taskId, {
  result: `Migrated storage from single tasks.json to one file per task:

Structure:
- Each task stored as .overseer/tasks/{id}.json
- No index file (avoids merge conflicts)
- Directory scanned on read to build task list

Implementation:
- Modified Storage.read() to scan .overseer/tasks/ directory
- Modified Storage.write() to write/delete individual task files
- Auto-migration from old single-file format on first read
- Atomic writes using temp file + rename pattern

Trade-offs:
- Slightly slower reads (must scan directory + parse each file)
- Acceptable since task count is typically small (<100)
- Better git history - each task change is isolated

Verification:
- All 60 tests passing
- Build successful
- Manually tested migration: old -> new format works`
});
```

**Why it works:** States what changed, lists implementation details, explains trade-offs, confirms verification.

### Bad Result Example

```javascript
await tasks.complete(taskId, { result: "Fixed the storage issue" });
```

**What's missing:** What was actually implemented, how, what decisions were made, verification evidence.

## Subtask Context Example

Link subtasks to their parent and explain what this piece does specifically:

```javascript
await tasks.create({
  description: "Add token verification function",
  parentId: jwtTaskId,
  context: `Part of JWT middleware (parent task). This subtask: token verification.

What it does:
- Verify JWT signature and expiration on protected routes
- Extract user ID from token payload
- Attach user object to request
- Return 401 for invalid/expired tokens

Implementation:
- Create src/middleware/verify-token.ts
- Export verifyToken middleware function
- Use jose library (preferred over jsonwebtoken)
- Handle expired vs invalid token cases separately

Done when:
- Middleware function complete and working
- Unit tests cover valid/invalid/expired scenarios
- Integrated into auth routes in server.ts
- Parent task can use this to protect endpoints`
});
```

## Error Handling Examples

### Handling Pending Children

```javascript
try {
  await tasks.complete(taskId, { result: "Done" });
} catch (err) {
  if (err.message.includes("pending children")) {
    const pending = await tasks.list({ parentId: taskId, completed: false });
    console.log(`Cannot complete: ${pending.length} children pending`);
    for (const child of pending) {
      console.log(`- ${child.id}: ${child.description}`);
    }
    return;
  }
  throw err;
}
```

### Handling Blocked Tasks

```javascript
const task = await tasks.get(taskId);

// blockedBy is omitted when empty, so default to [] before checking
if ((task.blockedBy ?? []).length > 0) {
  console.log("Task is blocked by:");
  for (const blockerId of task.blockedBy) {
    const blocker = await tasks.get(blockerId);
    console.log(`- ${blocker.description} (${blocker.completed ? 'done' : 'pending'})`);
  }
  return "Cannot start - blocked by other tasks";
}

await tasks.start(taskId);
```

## Creating Task Hierarchies

```javascript
// Create milestone with tasks
const milestone = await tasks.create({
  description: "Implement user authentication",
  context: "Full auth: JWT, login/logout, password reset, rate limiting",
  priority: 2
});

const subtasks = [
  "Add login endpoint",
  "Add logout endpoint",
  "Implement JWT token service",
  "Add password reset flow"
];

for (const desc of subtasks) {
  await tasks.create({ description: desc, parentId: milestone.id });
}
```

See @file references/hierarchies.md for sequential subtasks with blockers.

170
profiles/opencode/skill/overseer/references/hierarchies.md
Normal file
@@ -0,0 +1,170 @@
# Task Hierarchies

Guidance for organizing work into milestones, tasks, and subtasks.

## Three Levels

| Level | Name | Purpose | Example |
|-------|------|---------|---------|
| 0 | **Milestone** | Large initiative (5+ tasks) | "Add user authentication system" |
| 1 | **Task** | Significant work item | "Implement JWT middleware" |
| 2 | **Subtask** | Atomic implementation step | "Add token verification function" |

**Maximum depth is 3 levels.** Attempting to create a child of a subtask will fail.

## When to Use Each Level

### Single Task (No Hierarchy)

- Small feature (1-2 files, ~1 session)
- Work is atomic, no natural breakdown

### Task with Subtasks

- Medium feature (3-5 files, 3-7 steps)
- Work naturally decomposes into discrete steps
- Subtasks could be worked on independently

### Milestone with Tasks

- Large initiative (multiple areas, many sessions)
- Work spans 5+ distinct tasks
- You want high-level progress tracking

## Creating Hierarchies

```javascript
// Create the milestone
const milestone = await tasks.create({
  description: "Add user authentication system",
  context: "Full auth system with JWT tokens, password reset...",
  priority: 2
});

// Create tasks under it
const jwtTask = await tasks.create({
  description: "Implement JWT token generation",
  context: "Create token service with signing and verification...",
  parentId: milestone.id
});

const resetTask = await tasks.create({
  description: "Add password reset flow",
  context: "Email-based password reset with secure tokens...",
  parentId: milestone.id
});

// For complex tasks, add subtasks
const verifySubtask = await tasks.create({
  description: "Add token verification function",
  context: "Verify JWT signature and expiration...",
  parentId: jwtTask.id
});
```

## Subtask Best Practices

Each subtask should be:

- **Independently understandable**: Clear on its own
- **Linked to parent**: Reference parent, explain how this piece fits
- **Specific scope**: What this subtask does vs what parent/siblings do
- **Clear completion**: Define "done" for this piece specifically

Example subtask context:

```
Part of JWT middleware (parent task). This subtask: token verification.

What it does:
- Verify JWT signature and expiration
- Extract user ID from payload
- Return 401 for invalid/expired tokens

Done when:
- Function complete and tested
- Unit tests cover valid/invalid/expired cases
```

## Decomposition Strategy

When faced with large tasks:

1. **Assess scope**: Is this milestone-level (5+ tasks) or task-level (3-7 subtasks)?
2. Create parent task/milestone with overall goal and context
3. Analyze and identify 3-7 logical children
4. Create children with specific contexts and boundaries
5. Work through systematically, completing with results
6. Complete parent with summary of overall implementation
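Steps 5-6 above amount to a drain-the-queue loop. A self-contained sketch (illustrative: `fakeNextReady`/`fakeComplete` below are in-memory stand-ins for `tasks.nextReady()`/`tasks.complete()` so the sketch runs on its own):

```javascript
// In-memory stand-ins; in the codemode environment you would call
// the global tasks API instead.
const queue = [
  { id: "task_1", description: "Add login endpoint" },
  { id: "task_2", description: "Add logout endpoint" },
];
const completedIds = [];
async function fakeNextReady() {
  return queue.shift() ?? null;
}
async function fakeComplete(id, input) {
  completedIds.push(id);
}

// Step 5: work through children until nothing is ready;
// step 6 would then complete the parent with a summary.
async function drain() {
  let task;
  while ((task = await fakeNextReady()) !== null) {
    // ...implement the task here...
    await fakeComplete(task.id, { result: `Done: ${task.description}` });
  }
  return completedIds;
}

const drained = drain();
drained.then((ids) => console.log(ids)); // logs ["task_1", "task_2"]
```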

### Don't Over-Decompose

- **3-7 children per parent** is usually right
- If you'd only have 1-2 subtasks, just make separate tasks
- If you need depth 3+, restructure your breakdown

## Viewing Hierarchies

```javascript
// List all tasks under a milestone
const children = await tasks.list({ parentId: milestoneId });

// Get task with context breadcrumb
const task = await tasks.get(taskId);
// task.context.parent - parent's context
// task.context.milestone - root milestone's context

// Check progress
const pending = await tasks.list({ parentId: milestoneId, completed: false });
const done = await tasks.list({ parentId: milestoneId, completed: true });
console.log(`Progress: ${done.length}/${done.length + pending.length}`);
```

## Completion Rules

1. **Cannot complete with pending children**

   ```javascript
   // This will fail if the task has incomplete subtasks
   await tasks.complete(taskId, { result: "Done" });
   // Error: "pending children"
   ```

2. **Complete children first**

   - Work through subtasks systematically
   - Complete each with meaningful results

3. **Parent result summarizes overall implementation**

   ```javascript
   await tasks.complete(milestoneId, {
     result: `User authentication system complete:

   Implemented:
   - JWT token generation and verification
   - Login/logout endpoints
   - Password reset flow
   - Rate limiting

   5 tasks completed, all tests passing.`
   });
   ```

## Blocking Dependencies

Use `blockedBy` for cross-task dependencies:

```javascript
// Create task that depends on another
const deployTask = await tasks.create({
  description: "Deploy to production",
  context: "...",
  blockedBy: [testTaskId, reviewTaskId]
});

// Add blocker to existing task
await tasks.block(deployTaskId, testTaskId);

// Remove blocker
await tasks.unblock(deployTaskId, testTaskId);
```

**Use blockers when:**

- Task B cannot start until Task A completes
- Multiple tasks depend on a shared prerequisite

**Don't use blockers when:**

- Tasks can be worked on in parallel
- The dependency is just logical grouping (use subtasks instead)

186
profiles/opencode/skill/overseer/references/verification.md
Normal file
@@ -0,0 +1,186 @@
# Verification Guide

Before marking any task complete, you MUST verify your work. Verification separates "I think it's done" from "it's actually done."

## The Verification Process

1. **Re-read the task context**: What did you originally commit to do?
2. **Check acceptance criteria**: Does your implementation satisfy the "Done when" conditions?
3. **Run relevant tests**: Execute the test suite and document results
4. **Test manually**: Actually try the feature/change yourself
5. **Compare with requirements**: Does what you built match what was asked?

## Strong vs Weak Verification

### Strong Verification Examples

- "All 60 tests passing, build successful"
- "All 69 tests passing (4 new tests for middleware edge cases)"
- "Manually tested with valid/invalid/expired tokens - all cases work"
- "Ran `cargo test` - 142 tests passed, 0 failed"

### Weak Verification (Avoid)

- "Should work now" - "should" means not verified
- "Made the changes" - no evidence it works
- "Added tests" - did the tests pass? What's the count?
- "Fixed the bug" - what bug? Did you verify the fix?
- "Done" - done how? Prove it
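The weak patterns above are mechanical enough to lint for. A tiny sketch (illustrative only; `looksWeak` is a hypothetical helper, not part of Overseer):

```javascript
// Flags result text that matches the weak-verification patterns listed above.
const WEAK_PATTERNS = [
  /\bshould\s+work\b/i,
  /^made the changes\.?$/i,
  /^added tests\.?$/i,
  /^fixed the bug\.?$/i,
  /^done\.?$/i,
];

function looksWeak(result) {
  const text = result.trim();
  return WEAK_PATTERNS.some((p) => p.test(text));
}

console.log(looksWeak("Should work now"));                        // true
console.log(looksWeak("Done"));                                   // true
console.log(looksWeak("All 60 tests passing, build successful")); // false
```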
|
||||||
|
|
||||||
|
## Verification by Task Type
|
||||||
|
|
||||||
|
| Task Type | How to Verify |
|
||||||
|
|-----------|---------------|
|
||||||
|
| Code changes | Run full test suite, document passing count |
|
||||||
|
| New features | Run tests + manual testing of functionality |
|
||||||
|
| Configuration | Test the config works (run commands, check workflows) |
|
||||||
|
| Documentation | Verify examples work, links resolve, formatting renders |
|
||||||
|
| Refactoring | Confirm tests still pass, no behavior changes |
|
||||||
|
| Bug fixes | Reproduce bug first, verify fix, add regression test |
|
||||||
|
|
||||||
|
## Cross-Reference Checklist
|
||||||
|
|
||||||
|
Before marking complete, verify all applicable items:
|
||||||
|
|
||||||
|
- [ ] Task description requirements met
|
||||||
|
- [ ] Context "Done when" criteria satisfied
|
||||||
|
- [ ] Tests passing (document count: "All X tests passing")
|
||||||
|
- [ ] Build succeeds (if applicable)
|
||||||
|
- [ ] Manual testing done (describe what you tested)
|
||||||
|
- [ ] No regressions introduced
|
||||||
|
- [ ] Edge cases considered (error handling, invalid input)
|
||||||
|
- [ ] Follow-up work identified (created new tasks if needed)
|
||||||
|
|
||||||
|
**If you can't check all applicable boxes, the task isn't done yet.**
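That rule can be enforced mechanically: treat each applicable item as a boolean and gate completion on all of them. A sketch (the checklist shape is illustrative; `null` marks a not-applicable item):

```javascript
// Gate completion on the applicable checklist items.
// Illustrative only -- not part of the tasks API.
function isDone(checklist) {
  // Items set to null are "not applicable" and are skipped.
  const applicable = Object.entries(checklist).filter(([, v]) => v !== null);
  const unchecked = applicable.filter(([, v]) => v === false).map(([k]) => k);
  return { done: unchecked.length === 0, unchecked };
}

const verdict = isDone({
  requirementsMet: true,
  testsPassing: true,
  buildSucceeds: null, // not applicable for a docs-only change
  manualTesting: false,
});
// verdict.done === false; verdict.unchecked === ["manualTesting"]
```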
|
||||||
|
|
||||||
|
## Result Examples with Verification
|
||||||
|
|
||||||
|
### Code Implementation
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
await tasks.complete(taskId, `Implemented JWT middleware:
|
||||||
|
|
||||||
|
Implementation:
|
||||||
|
- Created src/middleware/verify-token.ts
|
||||||
|
- Separated 'expired' vs 'invalid' error codes
|
||||||
|
- Added user extraction from payload
|
||||||
|
|
||||||
|
Verification:
|
||||||
|
- All 69 tests passing (4 new tests for edge cases)
|
||||||
|
- Manually tested with valid token: Access granted
|
||||||
|
- Manually tested with expired token: 401 with 'token_expired'
|
||||||
|
- Manually tested with invalid signature: 401 with 'invalid_token'`);
|
||||||
|
```
|
||||||
|
|
||||||
|
### Configuration/Infrastructure
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
await tasks.complete(taskId, `Added GitHub Actions workflow for CI:
|
||||||
|
|
||||||
|
Implementation:
|
||||||
|
- Created .github/workflows/ci.yml
|
||||||
|
- Jobs: lint, test, build with pnpm cache
|
||||||
|
|
||||||
|
Verification:
|
||||||
|
- Pushed to test branch, opened PR #123
|
||||||
|
- Workflow triggered automatically
|
||||||
|
- All jobs passed (lint: 0 errors, test: 69/69, build: success)
|
||||||
|
- Total run time: 2m 34s`);
|
||||||
|
```
|
||||||
|
|
||||||
|
### Refactoring
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
await tasks.complete(taskId, `Refactored storage to one file per task:
|
||||||
|
|
||||||
|
Implementation:
|
||||||
|
- Split tasks.json into .overseer/tasks/{id}.json files
|
||||||
|
- Added auto-migration from old format
|
||||||
|
- Atomic writes via temp+rename
|
||||||
|
|
||||||
|
Verification:
|
||||||
|
- All 60 tests passing (including 8 storage tests)
|
||||||
|
- Build successful
|
||||||
|
- Manually tested migration: old -> new format works
|
||||||
|
- Confirmed git diff shows only changed tasks`);
|
||||||
|
```
|
||||||
|
|
||||||
|
### Bug Fix
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
await tasks.complete(taskId, `Fixed login validation accepting usernames with spaces:
|
||||||
|
|
||||||
|
Root cause:
|
||||||
|
- Validation regex didn't account for leading/trailing spaces
|
||||||
|
|
||||||
|
Fix:
|
||||||
|
- Added .trim() before validation in src/auth/validate.ts:42
|
||||||
|
- Updated regex to reject internal spaces
|
||||||
|
|
||||||
|
Verification:
|
||||||
|
- All 45 tests passing (2 new regression tests)
|
||||||
|
- Manually tested:
|
||||||
|
- " admin" -> rejected (leading space)
|
||||||
|
- "admin " -> rejected (trailing space)
|
||||||
|
- "ad min" -> rejected (internal space)
|
||||||
|
- "admin" -> accepted`);
|
||||||
|
```
|
||||||
|
|
||||||
|
### Documentation
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
await tasks.complete(taskId, `Updated API documentation for auth endpoints:
|
||||||
|
|
||||||
|
Implementation:
|
||||||
|
- Added docs for POST /auth/login
|
||||||
|
- Added docs for POST /auth/logout
|
||||||
|
- Added docs for POST /auth/refresh
|
||||||
|
- Included example requests/responses
|
||||||
|
|
||||||
|
Verification:
|
||||||
|
- All code examples tested and working
|
||||||
|
- Links verified (no 404s)
|
||||||
|
- Rendered in local preview - formatting correct
|
||||||
|
- Spell-checked content`);
|
||||||
|
```
|
||||||
|
|
||||||
|
## Common Verification Mistakes
|
||||||
|
|
||||||
|
| Mistake | Better Approach |
|
||||||
|
|---------|-----------------|
|
||||||
|
| "Tests pass" | "All 42 tests passing" (include count) |
|
||||||
|
| "Manually tested" | "Manually tested X, Y, Z scenarios" (be specific) |
|
||||||
|
| "Works" | "Works: [evidence]" (show proof) |
|
||||||
|
| "Fixed" | "Fixed: [root cause] -> [solution] -> [verification]" |
|
||||||
|
|
||||||
|
## When Verification Fails
|
||||||
|
|
||||||
|
If verification reveals issues:
|
||||||
|
|
||||||
|
1. **Don't complete the task** - it's not done
|
||||||
|
2. **Document what failed** in task context
|
||||||
|
3. **Fix the issues** before completing
|
||||||
|
4. **Re-verify** after fixes
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
// Update context with failure notes
|
||||||
|
await tasks.update(taskId, {
|
||||||
|
context: task.context + `
|
||||||
|
|
||||||
|
Verification attempt 1 (failed):
|
||||||
|
- Tests: 41/42 passing
|
||||||
|
- Failing: test_token_refresh - timeout issue
|
||||||
|
- Need to investigate async handling`
|
||||||
|
});
|
||||||
|
|
||||||
|
// After fixing
|
||||||
|
await tasks.complete(taskId, `Implemented token refresh:
|
||||||
|
|
||||||
|
Implementation:
|
||||||
|
- Added refresh endpoint
|
||||||
|
- Fixed async timeout (was missing await)
|
||||||
|
|
||||||
|
Verification:
|
||||||
|
- All 42 tests passing (fixed timeout issue)
|
||||||
|
- Manual testing: refresh works within 30s window`);
|
||||||
|
```
|
||||||
`profiles/opencode/skill/overseer/references/workflow.md` (new file, 164 lines)
|
|||||||
|
# Implementation Workflow
|
||||||
|
|
||||||
|
Step-by-step guide for working with Overseer tasks during implementation.
|
||||||
|
|
||||||
|
## 1. Get Next Ready Task
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
// Get next task with full context (recommended)
|
||||||
|
const task = await tasks.nextReady();
|
||||||
|
|
||||||
|
// Or scope to specific milestone
|
||||||
|
const task = await tasks.nextReady(milestoneId);
|
||||||
|
|
||||||
|
if (!task) {
|
||||||
|
return "No tasks ready - all blocked or completed";
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
`nextReady()` returns a `TaskWithContext` (task with inherited context and learnings) or `null`.
|
||||||
|
|
||||||
|
## 2. Review Context
|
||||||
|
|
||||||
|
Before starting, verify you can answer:
|
||||||
|
- **What** needs to be done specifically?
|
||||||
|
- **Why** is this needed?
|
||||||
|
- **How** should it be implemented?
|
||||||
|
- **When** is it done (acceptance criteria)?
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
const task = await tasks.get(taskId);
|
||||||
|
|
||||||
|
// Task's own context
|
||||||
|
console.log("Task:", task.context.own);
|
||||||
|
|
||||||
|
// Parent context (if task has parent)
|
||||||
|
if (task.context.parent) {
|
||||||
|
console.log("Parent:", task.context.parent);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Milestone context (if depth > 1)
|
||||||
|
if (task.context.milestone) {
|
||||||
|
console.log("Milestone:", task.context.milestone);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Task's own learnings (bubbled from completed children)
|
||||||
|
console.log("Task learnings:", task.learnings.own);
|
||||||
|
```
|
||||||
|
|
||||||
|
**If any answer is unclear:**
|
||||||
|
1. Check parent task or completed blockers for details
|
||||||
|
2. Suggest entering plan mode to flesh out requirements
|
||||||
|
|
||||||
|
**Proceed without full context when:**
|
||||||
|
- Task is trivial/atomic (e.g., "Add .gitignore entry")
|
||||||
|
- Conversation already provides the missing context
|
||||||
|
- Description itself is sufficiently detailed
|
||||||
|
|
||||||
|
## 3. Start Task
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
await tasks.start(taskId);
|
||||||
|
```
|
||||||
|
|
||||||
|
**VCS Required:** Creates bookmark `task/<id>`, records start commit. Fails with `NotARepository` if no jj/git found.
|
||||||
|
|
||||||
|
After starting, the task status changes to `in_progress`.
|
||||||
|
|
||||||
|
## 4. Implement
|
||||||
|
|
||||||
|
Work on the task implementation. Note any learnings to include when completing.
|
||||||
|
|
||||||
|
## 5. Verify Work
|
||||||
|
|
||||||
|
Before completing, verify your implementation. See @file references/verification.md for full checklist.
|
||||||
|
|
||||||
|
Quick checklist:
|
||||||
|
- [ ] Task description requirements met
|
||||||
|
- [ ] Context "Done when" criteria satisfied
|
||||||
|
- [ ] Tests passing (document count)
|
||||||
|
- [ ] Build succeeds
|
||||||
|
- [ ] Manual testing done
|
||||||
|
|
||||||
|
## 6. Complete Task with Learnings
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
await tasks.complete(taskId, {
|
||||||
|
result: `Implemented login endpoint:
|
||||||
|
|
||||||
|
Implementation:
|
||||||
|
- Created src/auth/login.ts
|
||||||
|
- Added JWT token generation
|
||||||
|
- Integrated with user service
|
||||||
|
|
||||||
|
Verification:
|
||||||
|
- All 42 tests passing (3 new)
|
||||||
|
- Manually tested valid/invalid credentials`,
|
||||||
|
learnings: [
|
||||||
|
"bcrypt rounds should be 12+ for production",
|
||||||
|
"jose library preferred over jsonwebtoken"
|
||||||
|
]
|
||||||
|
});
|
||||||
|
```
|
||||||
|
|
||||||
|
**VCS Required:** Commits changes (NothingToCommit treated as success), then deletes the task's bookmark (best-effort) and clears the DB bookmark field on success. Fails with `NotARepository` if no jj/git found.
|
||||||
|
|
||||||
|
**Learnings Effect:** Learnings bubble to immediate parent only. `sourceTaskId` is preserved through bubbling, so if this task's learnings later bubble further, the origin is tracked.
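That bubbling behavior can be sketched as a pure function (the data shapes here are illustrative, not the actual storage format):

```javascript
// Bubble a completed task's learnings to its immediate parent only.
// sourceTaskId is set once at the origin task and carried unchanged
// on any later hop, so the origin stays traceable.
function bubbleLearnings(task, parent) {
  const bubbled = task.learnings.map(l => ({
    text: l.text,
    sourceTaskId: l.sourceTaskId ?? task.id, // preserve origin if already set
  }));
  return { ...parent, learnings: [...parent.learnings, ...bubbled] };
}

const child = { id: "t1", learnings: [{ text: "use jose for JWT" }] };
const parent = { id: "m1", learnings: [] };
const updatedParent = bubbleLearnings(child, parent);
// updatedParent.learnings[0].sourceTaskId === "t1"
```

If `updatedParent` later completes and its learnings bubble again, the `sourceTaskId` still reads `"t1"`, not `"m1"`.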
|
||||||
|
|
||||||
|
The `result` becomes part of the task's permanent record.
|
||||||
|
|
||||||
|
## VCS Integration (Required for Workflow)
|
||||||
|
|
||||||
|
VCS operations are **automatically handled** by the tasks API:
|
||||||
|
|
||||||
|
| Task Operation | VCS Effect |
|
||||||
|
|----------------|------------|
|
||||||
|
| `tasks.start(id)` | **VCS required** - creates bookmark `task/<id>`, records start commit |
|
||||||
|
| `tasks.complete(id)` | **VCS required** - commits changes, deletes bookmark (best-effort), clears DB bookmark on success |
|
||||||
|
| `tasks.complete(milestoneId)` | Same + deletes ALL descendant bookmarks recursively (depth-1 and depth-2) |
|
||||||
|
| `tasks.delete(id)` | Best-effort bookmark cleanup (logs warning on failure) |
|
||||||
|
|
||||||
|
**Note:** VCS (jj or git) is required for start/complete. CRUD operations work without VCS.
|
||||||
|
|
||||||
|
## Error Handling
|
||||||
|
|
||||||
|
### Pending Children
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
try {
|
||||||
|
await tasks.complete(taskId, "Done");
|
||||||
|
} catch (err) {
|
||||||
|
if (err.message.includes("pending children")) {
|
||||||
|
const pending = await tasks.list({ parentId: taskId, completed: false });
|
||||||
|
return `Cannot complete: ${pending.length} children pending`;
|
||||||
|
}
|
||||||
|
throw err;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### Task Not Ready
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
const task = await tasks.get(taskId);
|
||||||
|
|
||||||
|
// Check if blocked
|
||||||
|
if (task.blockedBy.length > 0) {
|
||||||
|
console.log("Blocked by:", task.blockedBy);
|
||||||
|
// Complete blockers first or unblock
|
||||||
|
await tasks.unblock(taskId, blockerId);
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## Complete Workflow Example
|
||||||
|
|
||||||
|
```javascript
|
||||||
|
const task = await tasks.nextReady();
|
||||||
|
if (!task) return "No ready tasks";
|
||||||
|
|
||||||
|
await tasks.start(task.id);
|
||||||
|
// ... implement ...
|
||||||
|
await tasks.complete(task.id, {
|
||||||
|
result: "Implemented: ... Verification: All 58 tests passing",
|
||||||
|
learnings: ["Use jose for JWT"]
|
||||||
|
});
|
||||||
|
```
|
||||||
`profiles/opencode/skill/session-export/SKILL.md` (new file, 122 lines)
|
|||||||
|
---
|
||||||
|
name: session-export
|
||||||
|
description: Update GitHub PR descriptions with AI session export summaries. Use when user asks to add session summary to PR/MR, document AI assistance in PR/MR, or export conversation summary to PR/MR description.
|
||||||
|
---
|
||||||
|
|
||||||
|
# Session Export
|
||||||
|
|
||||||
|
Update PR/MR descriptions with a structured summary of the AI-assisted conversation.
|
||||||
|
|
||||||
|
## Output Format
|
||||||
|
|
||||||
|
````markdown
> [!NOTE]
> This PR was written with AI assistance.

<details><summary>AI Session Export</summary>
<p>

```json
{
  "info": {
    "title": "<brief task description>",
    "agent": "opencode",
    "models": ["<model(s) used>"]
  },
  "summary": [
    "<action 1>",
    "<action 2>",
    ...
  ]
}
```

</p>
</details>
````
|
||||||
|
|
||||||
|
## Workflow
|
||||||
|
|
||||||
|
### 1. Export Session Data
|
||||||
|
|
||||||
|
Get session data using OpenCode CLI:
|
||||||
|
|
||||||
|
```bash
|
||||||
|
opencode export [sessionID]
|
||||||
|
```
|
||||||
|
|
||||||
|
Returns JSON with session info including models used. Use current session if no sessionID provided.
|
||||||
|
|
||||||
|
### 2. Generate Summary JSON
|
||||||
|
|
||||||
|
From exported data and conversation context, create summary:
|
||||||
|
|
||||||
|
- **title**: 2-5 word task description (lowercase)
|
||||||
|
- **agent**: always "opencode"
|
||||||
|
- **models**: array from export data
|
||||||
|
- **summary**: array of terse action statements
|
||||||
|
- Use past tense ("added", "fixed", "created")
|
||||||
|
- Start with "user requested..." or "user asked..."
|
||||||
|
- Chronological order
|
||||||
|
  - Keep the summary to at most 25 turns ("user requested", "agent did")
|
||||||
|
- **NEVER include sensitive data**: API keys, credentials, secrets, tokens, passwords, env vars
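Before writing the summary, it is worth screening each line for obvious secret patterns; a rough sketch (the patterns are illustrative and deliberately not exhaustive — a denylist never is):

```javascript
// Drop summary lines that look like they contain secrets.
// Heuristic screen only; manual review is still required.
const SECRET_PATTERNS = [
  /api[_-]?key/i,
  /secret/i,
  /password/i,
  /bearer\s+[a-z0-9._-]+/i,
  /[A-Za-z0-9_]+=\S{8,}/, // KEY=value style env assignments
];

function screenSummary(lines) {
  return lines.filter(line => !SECRET_PATTERNS.some(re => re.test(line)));
}

screenSummary([
  "agent added dark mode toggle",
  "agent set API_KEY=sk_live_abcdef123",
]);
// -> ["agent added dark mode toggle"]
```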
|
||||||
|
|
||||||
|
### 3. Update PR/MR Description
|
||||||
|
|
||||||
|
**GitHub:**
|
||||||
|
```bash
|
||||||
|
gh pr edit <PR_NUMBER> --body "$(cat <<'EOF'
|
||||||
|
<existing description>
|
||||||
|
|
||||||
|
> [!NOTE]
|
||||||
|
> This PR was written with AI assistance.
|
||||||
|
|
||||||
|
<details><summary>AI Session Export</summary>
|
||||||
|
...
|
||||||
|
</details>
|
||||||
|
EOF
|
||||||
|
)"
|
||||||
|
```
|
||||||
|
|
||||||
|
### 4. Preserve Existing Content
|
||||||
|
|
||||||
|
Always fetch and preserve existing PR/MR description:
|
||||||
|
|
||||||
|
```bash
# GitHub
gh pr view <PR_NUMBER> --json body -q '.body'
```

Append the session export after the existing content, separated by a blank line.
|
||||||
|
|
||||||
|
## Example Summary
|
||||||
|
|
||||||
|
For a session where user asked to add dark mode:
|
||||||
|
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"info": {
|
||||||
|
"title": "dark mode implementation",
|
||||||
|
"agent": "opencode",
|
||||||
|
"models": ["claude sonnet 4"]
|
||||||
|
},
|
||||||
|
"summary": [
|
||||||
|
"user requested dark mode toggle in settings",
|
||||||
|
"agent explored existing theme system",
|
||||||
|
"agent created ThemeContext for state management",
|
||||||
|
"agent added DarkModeToggle component",
|
||||||
|
"agent updated CSS variables for dark theme",
|
||||||
|
"agent ran tests and fixed 2 failures",
|
||||||
|
"agent committed changes"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## Security
|
||||||
|
|
||||||
|
**NEVER include in summary:**
|
||||||
|
- API keys, tokens, secrets
|
||||||
|
- Passwords, credentials
|
||||||
|
- Environment variable values
|
||||||
|
- Private URLs with auth tokens
|
||||||
|
- Personal identifiable information
|
||||||
|
- Internal hostnames/IPs
|
||||||
`profiles/opencode/skill/solidjs/SKILL.md` (new file, 464 lines)
|
|||||||
|
---
|
||||||
|
name: solidjs
|
||||||
|
description: |
|
||||||
|
SolidJS framework development skill for building reactive web applications with fine-grained reactivity.
|
||||||
|
Use when working with SolidJS projects including: (1) Creating components with signals, stores, and effects,
|
||||||
|
(2) Implementing reactive state management, (3) Using control flow components (Show, For, Switch/Match),
|
||||||
|
(4) Setting up routing with Solid Router, (5) Building full-stack apps with SolidStart,
|
||||||
|
(6) Data fetching with createResource, (7) Context API for shared state, (8) SSR/SSG configuration.
|
||||||
|
Triggers: solid, solidjs, solid-js, solid start, solidstart, createSignal, createStore, createEffect.
|
||||||
|
---
|
||||||
|
|
||||||
|
# SolidJS Development
|
||||||
|
|
||||||
|
SolidJS is a declarative JavaScript library for building user interfaces with fine-grained reactivity. Unlike virtual DOM frameworks, Solid compiles templates to real DOM nodes and updates them with fine-grained reactions.
|
||||||
|
|
||||||
|
## Core Principles
|
||||||
|
|
||||||
|
1. **Components run once** — Component functions execute only during initialization, not on every update
|
||||||
|
2. **Fine-grained reactivity** — Only the specific DOM nodes that depend on changed data update
|
||||||
|
3. **No virtual DOM** — Direct DOM manipulation via compiled templates
|
||||||
|
4. **Signals are functions** — Access values by calling: `count()` not `count`
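The model behind these principles can be illustrated with a toy signal/effect pair — a teaching sketch in plain JavaScript, not Solid's actual implementation:

```javascript
// Toy fine-grained reactivity: each signal tracks which effects read it
// and re-runs exactly those effects on write. No diffing, no re-render.
let currentEffect = null;

function toySignal(value) {
  const subscribers = new Set();
  const read = () => {
    if (currentEffect) subscribers.add(currentEffect); // track the reader
    return value;
  };
  const write = (next) => {
    value = next;
    subscribers.forEach(fn => fn()); // update only dependents
  };
  return [read, write];
}

function toyEffect(fn) {
  currentEffect = fn;
  fn(); // initial run registers dependencies
  currentEffect = null;
}

const [count, setCount] = toySignal(0);
const log = [];
toyEffect(() => log.push(count()));
setCount(1);
setCount(2);
// log is [0, 1, 2] -- the effect re-ran once per write
```

The component function itself never re-runs; only the subscribed effect does, which is why Solid components execute once.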
|
||||||
|
|
||||||
|
## Reactivity Primitives
|
||||||
|
|
||||||
|
### Signals — Basic State
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { createSignal } from "solid-js";
|
||||||
|
|
||||||
|
const [count, setCount] = createSignal(0);
|
||||||
|
|
||||||
|
// Read value (getter)
|
||||||
|
console.log(count()); // 0
|
||||||
|
|
||||||
|
// Update value (setter)
|
||||||
|
setCount(1);
|
||||||
|
setCount(prev => prev + 1); // Functional update
|
||||||
|
```
|
||||||
|
|
||||||
|
**Options:**
|
||||||
|
```tsx
|
||||||
|
const [value, setValue] = createSignal(initialValue, {
|
||||||
|
equals: false, // Always trigger updates, even if value unchanged
|
||||||
|
name: "debugName" // For devtools
|
||||||
|
});
|
||||||
|
```
|
||||||
|
|
||||||
|
### Effects — Side Effects
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { createEffect } from "solid-js";
|
||||||
|
|
||||||
|
createEffect(() => {
|
||||||
|
console.log("Count changed:", count());
|
||||||
|
// Runs after render, re-runs when dependencies change
|
||||||
|
});
|
||||||
|
```
|
||||||
|
|
||||||
|
**Key behaviors:**
|
||||||
|
- Initial run: after render, before browser paint
|
||||||
|
- Subsequent runs: when tracked dependencies change
|
||||||
|
- Never runs during SSR or hydration
|
||||||
|
- Use `onCleanup` for cleanup logic
|
||||||
|
|
||||||
|
### Memos — Derived/Cached Values
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { createMemo } from "solid-js";
|
||||||
|
|
||||||
|
const doubled = createMemo(() => count() * 2);
|
||||||
|
|
||||||
|
// Access like signal
|
||||||
|
console.log(doubled()); // Cached, only recalculates when count changes
|
||||||
|
```
|
||||||
|
|
||||||
|
Use memos when:
|
||||||
|
- Derived value is expensive to compute
|
||||||
|
- Derived value is accessed multiple times
|
||||||
|
- You want to prevent downstream updates when result unchanged
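The caching benefit shows up in a plain-JS analogy (a sketch only; Solid's `createMemo` additionally tracks its dependencies automatically instead of taking an explicit version):

```javascript
// A memo recomputes only when its input version changes; a plain
// derived function would recompute on every read. Illustrative analogy.
function makeMemo(compute) {
  let lastVersion = -1;
  let cached;
  return (version) => {
    if (version !== lastVersion) {
      cached = compute(); // recompute only on change
      lastVersion = version;
    }
    return cached;
  };
}

let runs = 0;
const memo = makeMemo(() => { runs++; return 42; });

memo(1); memo(1); memo(1); // computed once, then served from cache
memo(2);                   // recomputed once on change
// runs === 2
```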
|
||||||
|
|
||||||
|
### Resources — Async Data
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { createResource } from "solid-js";
|
||||||
|
|
||||||
|
const [user, { mutate, refetch }] = createResource(userId, fetchUser);
|
||||||
|
|
||||||
|
// In JSX
|
||||||
|
<Show when={!user.loading} fallback={<Loading />}>
|
||||||
|
<div>{user()?.name}</div>
|
||||||
|
</Show>
|
||||||
|
|
||||||
|
// Resource properties
|
||||||
|
user.loading // boolean
|
||||||
|
user.error // error if failed
|
||||||
|
user.state // "unresolved" | "pending" | "ready" | "refreshing" | "errored"
|
||||||
|
user.latest // last successful value
|
||||||
|
```
|
||||||
|
|
||||||
|
## Stores — Complex State
|
||||||
|
|
||||||
|
For nested objects/arrays with fine-grained updates:
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { createStore } from "solid-js/store";
|
||||||
|
|
||||||
|
const [state, setState] = createStore({
|
||||||
|
user: { name: "John", age: 30 },
|
||||||
|
todos: []
|
||||||
|
});
|
||||||
|
|
||||||
|
// Path syntax updates
|
||||||
|
setState("user", "name", "Jane");
|
||||||
|
setState("todos", todos => [...todos, newTodo]);
|
||||||
|
setState("todos", 0, "completed", true);
|
||||||
|
|
||||||
|
// Produce for immer-like updates
|
||||||
|
import { produce } from "solid-js/store";
|
||||||
|
setState(produce(s => {
|
||||||
|
s.user.age++;
|
||||||
|
s.todos.push(newTodo);
|
||||||
|
}));
|
||||||
|
```
|
||||||
|
|
||||||
|
**Store utilities:**
|
||||||
|
- `produce` — Immer-like mutations
|
||||||
|
- `reconcile` — Diff and patch data (for API responses)
|
||||||
|
- `unwrap` — Get raw non-reactive object
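The path syntax can be modeled as a set-at-path operation; this sketch returns copies to show the semantics, whereas Solid's store updates proxies in place with fine-grained notification:

```javascript
// Model of setState("todos", 0, "completed", true): walk the path,
// copying each level, and set (or compute) the leaf value.
// Semantics sketch only -- not Solid's implementation.
function setPath(obj, path, value) {
  if (path.length === 0) {
    return typeof value === "function" ? value(obj) : value;
  }
  const [key, ...rest] = path;
  const copy = Array.isArray(obj) ? obj.slice() : { ...obj };
  copy[key] = setPath(obj[key], rest, value);
  return copy;
}

const before = { todos: [{ text: "a", completed: false }] };
const after = setPath(before, ["todos", 0, "completed"], true);
// after.todos[0].completed === true; `before` is untouched
```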
|
||||||
|
|
||||||
|
## Components
|
||||||
|
|
||||||
|
### Basic Component
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { Component } from "solid-js";
|
||||||
|
|
||||||
|
const MyComponent: Component<{ name: string }> = (props) => {
|
||||||
|
return <div>Hello, {props.name}</div>;
|
||||||
|
};
|
||||||
|
```
|
||||||
|
|
||||||
|
### Props Handling
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { splitProps, mergeProps } from "solid-js";
|
||||||
|
|
||||||
|
// Default props
|
||||||
|
const merged = mergeProps({ size: "medium" }, props);
|
||||||
|
|
||||||
|
// Split props (for spreading)
|
||||||
|
const [local, others] = splitProps(props, ["class", "onClick"]);
|
||||||
|
return <button class={local.class} {...others} />;
|
||||||
|
```
|
||||||
|
|
||||||
|
**Props rules:**
|
||||||
|
- Props are reactive getters — don't destructure at top level
|
||||||
|
- Use `props.value` in JSX, not `const { value } = props`
|
||||||
|
|
||||||
|
### Children Helper
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { children } from "solid-js";
|
||||||
|
|
||||||
|
const Wrapper: Component = (props) => {
|
||||||
|
const resolved = children(() => props.children);
|
||||||
|
|
||||||
|
createEffect(() => {
|
||||||
|
console.log("Children:", resolved());
|
||||||
|
});
|
||||||
|
|
||||||
|
return <div>{resolved()}</div>;
|
||||||
|
};
|
||||||
|
```
|
||||||
|
|
||||||
|
## Control Flow Components
|
||||||
|
|
||||||
|
### Show — Conditional Rendering
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { Show } from "solid-js";
|
||||||
|
|
||||||
|
<Show when={user()} fallback={<Login />}>
|
||||||
|
{(user) => <Profile user={user()} />}
|
||||||
|
</Show>
|
||||||
|
```
|
||||||
|
|
||||||
|
### For — List Rendering (keyed by reference)
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { For } from "solid-js";
|
||||||
|
|
||||||
|
<For each={items()} fallback={<Empty />}>
|
||||||
|
{(item, index) => (
|
||||||
|
<div>{index()}: {item.name}</div>
|
||||||
|
)}
|
||||||
|
</For>
|
||||||
|
```
|
||||||
|
|
||||||
|
**Note:** `index` is a signal, `item` is the value.
|
||||||
|
|
||||||
|
### Index — List Rendering (keyed by index)
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { Index } from "solid-js";
|
||||||
|
|
||||||
|
<Index each={items()}>
|
||||||
|
{(item, index) => (
|
||||||
|
<input value={item().text} />
|
||||||
|
)}
|
||||||
|
</Index>
|
||||||
|
```
|
||||||
|
|
||||||
|
**Note:** `item` is a signal, `index` is the value. Better for primitive arrays or inputs.
|
||||||
|
|
||||||
|
### Switch/Match — Multiple Conditions
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { Switch, Match } from "solid-js";
|
||||||
|
|
||||||
|
<Switch fallback={<Default />}>
|
||||||
|
<Match when={state() === "loading"}>
|
||||||
|
<Loading />
|
||||||
|
</Match>
|
||||||
|
<Match when={state() === "error"}>
|
||||||
|
<Error />
|
||||||
|
</Match>
|
||||||
|
<Match when={state() === "success"}>
|
||||||
|
<Success />
|
||||||
|
</Match>
|
||||||
|
</Switch>
|
||||||
|
```
|
||||||
|
|
||||||
|
### Dynamic — Dynamic Component
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { Dynamic } from "solid-js/web";
|
||||||
|
|
||||||
|
<Dynamic component={selected()} someProp="value" />
|
||||||
|
```
|
||||||
|
|
||||||
|
### Portal — Render Outside DOM Hierarchy
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { Portal } from "solid-js/web";
|
||||||
|
|
||||||
|
<Portal mount={document.body}>
|
||||||
|
<Modal />
|
||||||
|
</Portal>
|
||||||
|
```
|
||||||
|
|
||||||
|
### ErrorBoundary — Error Handling
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { ErrorBoundary } from "solid-js";
|
||||||
|
|
||||||
|
<ErrorBoundary fallback={(err, reset) => (
|
||||||
|
<div>
|
||||||
|
Error: {err.message}
|
||||||
|
<button onClick={reset}>Retry</button>
|
||||||
|
</div>
|
||||||
|
)}>
|
||||||
|
<RiskyComponent />
|
||||||
|
</ErrorBoundary>
|
||||||
|
```
|
||||||
|
|
||||||
|
### Suspense — Async Loading
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { Suspense } from "solid-js";
|
||||||
|
|
||||||
|
<Suspense fallback={<Loading />}>
|
||||||
|
<AsyncComponent />
|
||||||
|
</Suspense>
|
||||||
|
```
|
||||||
|
|
||||||
|
## Context API
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { createContext, useContext } from "solid-js";
|
||||||
|
|
||||||
|
// Create context
|
||||||
|
const CounterContext = createContext<{
|
||||||
|
count: () => number;
|
||||||
|
increment: () => void;
|
||||||
|
}>();
|
||||||
|
|
||||||
|
// Provider component
|
||||||
|
export function CounterProvider(props) {
|
||||||
|
const [count, setCount] = createSignal(0);
|
||||||
|
|
||||||
|
return (
|
||||||
|
<CounterContext.Provider value={{
|
||||||
|
count,
|
||||||
|
increment: () => setCount(c => c + 1)
|
||||||
|
}}>
|
||||||
|
{props.children}
|
||||||
|
</CounterContext.Provider>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
// Consumer hook
|
||||||
|
export function useCounter() {
|
||||||
|
const ctx = useContext(CounterContext);
|
||||||
|
if (!ctx) throw new Error("useCounter must be used within CounterProvider");
|
||||||
|
return ctx;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## Lifecycle
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { onMount, onCleanup } from "solid-js";
|
||||||
|
|
||||||
|
function MyComponent() {
|
||||||
|
onMount(() => {
|
||||||
|
console.log("Mounted");
|
||||||
|
const handler = () => {};
|
||||||
|
window.addEventListener("resize", handler);
|
||||||
|
|
||||||
|
onCleanup(() => {
|
||||||
|
window.removeEventListener("resize", handler);
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
return <div>Content</div>;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
## Refs
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
let inputRef!: HTMLInputElement; // definite assignment: set via ref before use
|
||||||
|
|
||||||
|
<input ref={inputRef} />
|
||||||
|
<input ref={(el) => { /* el is the DOM element */ }} />
|
||||||
|
```
|
||||||
|
|
||||||
|
## Event Handling
|
||||||
|
|
||||||
|
```tsx
// Delegated events (camelCase, e.g. onClick) - Solid attaches a single
// listener at the document root for common UI events
<button onClick={handleClick}>Click</button>
<button onClick={(e) => handleClick(e)}>Click</button>

// Native, non-delegated listeners (on: prefix) - bound directly to the element
<input on:input={handleInput} />
<div on:scroll={handleScroll} />
```
|
||||||
|
|
||||||
|
## Common Patterns
|
||||||
|
|
||||||
|
### Conditional Classes
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { clsx } from "clsx"; // or classList
|
||||||
|
|
||||||
|
<div class={clsx("base", { active: isActive() })} />
|
||||||
|
<div classList={{ active: isActive(), disabled: isDisabled() }} />
|
||||||
|
```
|
||||||
|
|
||||||
|
### Batch Updates
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { batch } from "solid-js";
|
||||||
|
|
||||||
|
batch(() => {
|
||||||
|
setName("John");
|
||||||
|
setAge(30);
|
||||||
|
// Effects run once after batch completes
|
||||||
|
});
|
||||||
|
```
|
||||||
|
|
||||||
|
### Untrack
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import { untrack } from "solid-js";
|
||||||
|
|
||||||
|
createEffect(() => {
|
||||||
|
console.log(count()); // tracked
|
||||||
|
console.log(untrack(() => other())); // not tracked
|
||||||
|
});
|
||||||
|
```
|
||||||
|
|
||||||
|
## TypeScript
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
import type { Component, ParentComponent, JSX } from "solid-js";
|
||||||
|
|
||||||
|
// Basic component
|
||||||
|
const Button: Component<{ label: string }> = (props) => (
|
||||||
|
<button>{props.label}</button>
|
||||||
|
);
|
||||||
|
|
||||||
|
// With children
|
||||||
|
const Layout: ParentComponent<{ title: string }> = (props) => (
|
||||||
|
<div>
|
||||||
|
<h1>{props.title}</h1>
|
||||||
|
{props.children}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
|
||||||
|
// Event handler types
|
||||||
|
const handleClick: JSX.EventHandler<HTMLButtonElement, MouseEvent> = (e) => {
|
||||||
|
console.log(e.currentTarget);
|
||||||
|
};
|
||||||
|
```
|
||||||
|
|
||||||
|
## Project Setup
|
||||||
|
|
||||||
|
```bash
|
||||||
|
# Create new project
|
||||||
|
npm create solid@latest my-app
|
||||||
|
|
||||||
|
# With template
|
||||||
|
npx degit solidjs/templates/ts my-app
|
||||||
|
|
||||||
|
# SolidStart
|
||||||
|
npm create solid@latest my-app -- --template solidstart
|
||||||
|
```
|
||||||
|
|
||||||
|
**vite.config.ts:**
|
||||||
|
```ts
|
||||||
|
import { defineConfig } from "vite";
|
||||||
|
import solid from "vite-plugin-solid";
|
||||||
|
|
||||||
|
export default defineConfig({
|
||||||
|
plugins: [solid()]
|
||||||
|
});
|
||||||
|
```
|
||||||
|
|
||||||
|
## Anti-Patterns to Avoid
|
||||||
|
|
||||||
|
1. **Destructuring props** — Breaks reactivity
|
||||||
|
```tsx
|
||||||
|
// ❌ Bad
|
||||||
|
const { name } = props;
|
||||||
|
|
||||||
|
// ✅ Good
|
||||||
|
props.name
|
||||||
|
```
|
||||||
|
|
||||||
|
2. **Accessing signals outside tracking scope**
|
||||||
|
```tsx
|
||||||
|
// ❌ Won't update
|
||||||
|
console.log(count());
|
||||||
|
|
||||||
|
// ✅ Will update
|
||||||
|
createEffect(() => console.log(count()));
|
||||||
|
```
|
||||||
|
|
||||||
|
3. **Forgetting to call signal getters**
|
||||||
|
```tsx
|
||||||
|
// ❌ Passes the function
|
||||||
|
<div>{count}</div>
|
||||||
|
|
||||||
|
// ✅ Passes the value
|
||||||
|
<div>{count()}</div>
|
||||||
|
```
|
||||||
|
|
||||||
|
4. **Using array index as key** — Use `<For>` for reference-keyed, `<Index>` for index-keyed
|
||||||
|
|
||||||
|
5. **Side effects during render** — Use `createEffect` or `onMount`
|
||||||
`profiles/opencode/skill/solidjs/references/api_reference.md` (new file, 777 lines)
|
|||||||
|
# SolidJS API Reference

Complete reference for all SolidJS primitives, utilities, and component APIs.

## Basic Reactivity

### createSignal

```tsx
import { createSignal } from "solid-js";

const [getter, setter] = createSignal<T>(initialValue, options?);

// Options
interface SignalOptions<T> {
  equals?: false | ((prev: T, next: T) => boolean);
  name?: string;
  internal?: boolean;
}
```

**Examples:**
```tsx
const [count, setCount] = createSignal(0);
const [user, setUser] = createSignal<User | null>(null);

// Always update
const [data, setData] = createSignal(obj, { equals: false });

// Custom equality
const [items, setItems] = createSignal([], {
  equals: (a, b) => a.length === b.length
});

// Setter forms
setCount(5);                 // Direct value
setCount(prev => prev + 1);  // Functional update
```

### createEffect

```tsx
import { createEffect } from "solid-js";

createEffect<T>(fn: (prev: T) => T, initialValue?: T, options?);

// Options
interface EffectOptions {
  name?: string;
}
```

**Examples:**
```tsx
// Basic
createEffect(() => {
  console.log("Count:", count());
});

// With previous value
createEffect((prev) => {
  console.log("Changed from", prev, "to", count());
  return count();
}, count());

// With cleanup
createEffect(() => {
  const handler = () => {};
  window.addEventListener("resize", handler);
  onCleanup(() => window.removeEventListener("resize", handler));
});
```

### createMemo

```tsx
import { createMemo } from "solid-js";

const getter = createMemo<T>(fn: (prev: T) => T, initialValue?: T, options?);

// Options
interface MemoOptions<T> {
  equals?: false | ((prev: T, next: T) => boolean);
  name?: string;
}
```

**Examples:**
```tsx
const doubled = createMemo(() => count() * 2);
const filtered = createMemo(() => items().filter(i => i.active));

// Previous value
const delta = createMemo((prev) => count() - prev, 0);
```

### createResource

```tsx
import { createResource } from "solid-js";

const [resource, { mutate, refetch }] = createResource(
  source?,  // Optional reactive source
  fetcher,  // (source, info) => Promise<T>
  options?
);

// Resource properties
resource()        // T | undefined
resource.loading  // boolean
resource.error    // any
resource.state    // "unresolved" | "pending" | "ready" | "refreshing" | "errored"
resource.latest   // T | undefined (last successful value)

// Options
interface ResourceOptions<T> {
  initialValue?: T;
  name?: string;
  deferStream?: boolean;
  ssrLoadFrom?: "initial" | "server";
  storage?: (init: T) => [Accessor<T>, Setter<T>];
  onHydrated?: (key, info: { value: T }) => void;
}
```

**Examples:**
```tsx
// Without source
const [users] = createResource(fetchUsers);

// With source
const [user] = createResource(userId, fetchUser);

// With options
const [data] = createResource(id, fetchData, {
  initialValue: [],
  deferStream: true
});

// Actions
mutate(newValue);     // Update locally
refetch();            // Re-fetch
refetch(customInfo);  // Passed to the fetcher as info.refetching
```
## Stores

### createStore

```tsx
import { createStore } from "solid-js/store";

const [store, setStore] = createStore<T>(initialValue);
```

**Update patterns:**
```tsx
const [state, setState] = createStore({
  user: { name: "John", age: 30 },
  todos: [{ id: 1, text: "Learn Solid", done: false }]
});

// Path syntax
setState("user", "name", "Jane");
setState("user", "age", a => a + 1);
setState("todos", 0, "done", true);

// Array operations
setState("todos", t => [...t, newTodo]);
setState("todos", todos.length, newTodo);

// Multiple paths
setState("todos", { from: 0, to: 2 }, "done", true);
setState("todos", [0, 2, 4], "done", true);
setState("todos", i => i.done, "done", false);

// Object merge (shallow)
setState("user", { age: 31 }); // Keeps other properties
```

### produce

```tsx
import { produce } from "solid-js/store";

setState(produce(draft => {
  draft.user.age++;
  draft.todos.push({ id: 2, text: "New", done: false });
  draft.todos[0].done = true;
}));
```

### reconcile

```tsx
import { reconcile } from "solid-js/store";

// Replace with diff (minimal updates)
setState("todos", reconcile(newTodosFromAPI));

// Options
reconcile(data, { key: "id", merge: true });
```

### unwrap

```tsx
import { unwrap } from "solid-js/store";

const raw = unwrap(store); // Non-reactive plain object
```

### createMutable

```tsx
import { createMutable } from "solid-js/store";

const state = createMutable({
  count: 0,
  user: { name: "John" }
});

// Direct mutation (like MobX)
state.count++;
state.user.name = "Jane";
```

### modifyMutable

```tsx
import { modifyMutable, reconcile, produce } from "solid-js/store";

modifyMutable(state, reconcile(newData));
modifyMutable(state, produce(s => { s.count++ }));
```
## Component APIs

### children

```tsx
import { children } from "solid-js";

const resolved = children(() => props.children);

// Access
resolved();          // JSX.Element | JSX.Element[]
resolved.toArray();  // Always an array
```

### createContext / useContext

```tsx
import { createContext, useContext } from "solid-js";

const MyContext = createContext<T>(defaultValue?);

// Provider
<MyContext.Provider value={value}>
  {children}
</MyContext.Provider>

// Consumer
const value = useContext(MyContext);
```

### createUniqueId

```tsx
import { createUniqueId } from "solid-js";

const id = createUniqueId(); // "0", "1", etc.
```

### lazy

```tsx
import { lazy } from "solid-js";

const LazyComponent = lazy(() => import("./Component"));

// Use with Suspense
<Suspense fallback={<Loading />}>
  <LazyComponent />
</Suspense>
```
## Lifecycle

### onMount

```tsx
import { onMount } from "solid-js";

onMount(() => {
  // Runs once after initial render
  console.log("Mounted");
});
```

### onCleanup

```tsx
import { onCleanup } from "solid-js";

// In component
onCleanup(() => {
  console.log("Cleaning up");
});

// In effect
createEffect(() => {
  const sub = subscribe();
  onCleanup(() => sub.unsubscribe());
});
```
## Reactive Utilities

### batch

```tsx
import { batch } from "solid-js";

batch(() => {
  setA(1);
  setB(2);
  setC(3);
  // Effects run once after the batch
});
```

### untrack

```tsx
import { untrack } from "solid-js";

createEffect(() => {
  console.log(a());                 // Tracked
  console.log(untrack(() => b()));  // Not tracked
});
```

### on

```tsx
import { on } from "solid-js";

// Explicit dependencies
createEffect(on(count, (value, prev) => {
  console.log("Count changed:", prev, "->", value);
}));

// Multiple dependencies
createEffect(on([a, b], ([a, b], [prevA, prevB]) => {
  console.log("Changed");
}));

// Defer first run
createEffect(on(count, (v) => console.log(v), { defer: true }));
```

### mergeProps

```tsx
import { mergeProps } from "solid-js";

const merged = mergeProps(
  { size: "medium", color: "blue" }, // Defaults
  props                              // Overrides
);
```

### splitProps

```tsx
import { splitProps } from "solid-js";

const [local, others] = splitProps(props, ["class", "onClick"]);
// local.class, local.onClick
// others contains everything else

const [a, b, rest] = splitProps(props, ["foo"], ["bar"]);
```

### createRoot

```tsx
import { createRoot } from "solid-js";

const dispose = createRoot((dispose) => {
  const [count, setCount] = createSignal(0);
  // Use signals...
  return dispose;
});

// Later
dispose();
```

### getOwner / runWithOwner

```tsx
import { getOwner, runWithOwner } from "solid-js";

const owner = getOwner();

// Later, in async code
runWithOwner(owner, () => {
  createEffect(() => {
    // This effect has proper ownership
  });
});
```

### mapArray

```tsx
import { mapArray } from "solid-js";

const mapped = mapArray(
  () => items(),
  (item, index) => ({ ...item, doubled: item.value * 2 })
);
```

### indexArray

```tsx
import { indexArray } from "solid-js";

const mapped = indexArray(
  () => items(),
  (item, index) => <div>{index}: {item().name}</div>
);
```

### observable

```tsx
import { observable } from "solid-js";

const obs = observable(signal);
obs.subscribe((value) => console.log(value));
```

### from

```tsx
import { from } from "solid-js";

// Convert an observable/subscribable to a signal
const fromObservable = from(rxObservable);

const fromSubscribe = from((set) => {
  const unsub = subscribe(set);
  return unsub;
});
```

### catchError

```tsx
import { catchError } from "solid-js";

catchError(
  () => riskyOperation(),
  (err) => console.error("Error:", err)
);
```
## Secondary Primitives

### createComputed

```tsx
import { createComputed } from "solid-js";

// Like createEffect but runs during the render phase
createComputed(() => {
  setDerived(source() * 2);
});
```

### createRenderEffect

```tsx
import { createRenderEffect } from "solid-js";

// Runs before paint (for DOM measurements)
createRenderEffect(() => {
  const height = element.offsetHeight;
});
```

### createDeferred

```tsx
import { createDeferred } from "solid-js";

// Returns the value after idle time
const deferred = createDeferred(() => expensiveComputation(), {
  timeoutMs: 1000
});
```

### createReaction

```tsx
import { createReaction } from "solid-js";

const track = createReaction(() => {
  console.log("Something changed");
});

track(() => count()); // Start tracking
```

### createSelector

```tsx
import { createSelector } from "solid-js";

const isSelected = createSelector(selectedId);

<For each={items()}>
  {(item) => (
    <div class={isSelected(item.id) ? "selected" : ""}>
      {item.name}
    </div>
  )}
</For>
```
## Components

### Show

```tsx
<Show when={condition()} fallback={<Fallback />}>
  <Content />
</Show>

// With callback (narrowed type)
<Show when={user()}>
  {(user) => <div>{user().name}</div>}
</Show>
```

### For

```tsx
<For each={items()} fallback={<Empty />}>
  {(item, index) => <div>{index()}: {item.name}</div>}
</For>
```

### Index

```tsx
<Index each={items()} fallback={<Empty />}>
  {(item, index) => <input value={item().text} />}
</Index>
```

### Switch / Match

```tsx
<Switch fallback={<Default />}>
  <Match when={state() === "loading"}>
    <Loading />
  </Match>
  <Match when={state() === "error"}>
    <Error />
  </Match>
</Switch>
```

### Dynamic

```tsx
import { Dynamic } from "solid-js/web";

<Dynamic component={selected()} prop={value} />
<Dynamic component="div" class="dynamic">Content</Dynamic>
```

### Portal

```tsx
import { Portal } from "solid-js/web";

<Portal mount={document.body}>
  <Modal />
</Portal>
```

### ErrorBoundary

```tsx
<ErrorBoundary fallback={(err, reset) => (
  <div>
    <p>Error: {err.message}</p>
    <button onClick={reset}>Retry</button>
  </div>
)}>
  <Content />
</ErrorBoundary>
```

### Suspense

```tsx
<Suspense fallback={<Loading />}>
  <AsyncContent />
</Suspense>
```

### SuspenseList

```tsx
<SuspenseList revealOrder="forwards" tail="collapsed">
  <Suspense fallback={<Loading />}><Item1 /></Suspense>
  <Suspense fallback={<Loading />}><Item2 /></Suspense>
  <Suspense fallback={<Loading />}><Item3 /></Suspense>
</SuspenseList>
```
## Rendering

### render

```tsx
import { render } from "solid-js/web";

const dispose = render(() => <App />, document.getElementById("root")!);

// Cleanup
dispose();
```

### hydrate

```tsx
import { hydrate } from "solid-js/web";

hydrate(() => <App />, document.getElementById("root")!);
```

### renderToString

```tsx
import { renderToString } from "solid-js/web";

const html = renderToString(() => <App />);
```

### renderToStringAsync

```tsx
import { renderToStringAsync } from "solid-js/web";

const html = await renderToStringAsync(() => <App />);
```

### renderToStream

```tsx
import { renderToStream } from "solid-js/web";

const stream = renderToStream(() => <App />);
stream.pipe(res);
```

### isServer

```tsx
import { isServer } from "solid-js/web";

if (isServer) {
  // Server-only code
}
```
## JSX Attributes

### ref

```tsx
let el: HTMLDivElement;
<div ref={el} />
<div ref={(e) => console.log(e)} />
```

### classList

```tsx
<div classList={{ active: isActive(), disabled: isDisabled() }} />
```

### style

```tsx
<div style={{ color: "red", "font-size": "14px" }} />
<div style={`color: ${color()}`} />
```

### on:event (native)

```tsx
<div on:click={handleClick} />
<div on:scroll={handleScroll} />
```

### use:directive

```tsx
function clickOutside(el: HTMLElement, accessor: () => () => void) {
  const handler = (e: MouseEvent) => {
    if (!el.contains(e.target as Node)) accessor()();
  };
  document.addEventListener("click", handler);
  onCleanup(() => document.removeEventListener("click", handler));
}

<div use:clickOutside={() => setOpen(false)} />
```

### prop:property

```tsx
<input prop:value={value()} /> // Set as a property, not an attribute
```

### attr:attribute

```tsx
<div attr:data-custom={value()} /> // Force attribute
```

### bool:attribute

```tsx
<input bool:disabled={isDisabled()} />
```

### @once

```tsx
<div title={/*@once*/ staticValue} /> // Never updates
```
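Custom directives used with `use:` are untyped by default. TypeScript can be told about them by augmenting Solid's `JSX.Directives` interface; a sketch for a `clickOutside` directive whose accessor returns a callback:

```tsx
declare module "solid-js" {
  namespace JSX {
    interface Directives {
      clickOutside: () => void;
    }
  }
}
```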
## Types

```tsx
import type {
  Component,
  ParentComponent,
  FlowComponent,
  VoidComponent,
  JSX,
  Accessor,
  Setter,
  Signal,
  Resource,
  Owner
} from "solid-js";

// Component types
const MyComponent: Component<Props> = (props) => <div />;
const Parent: ParentComponent<Props> = (props) => <div>{props.children}</div>;
const Flow: FlowComponent<Props, Item> = (props) => props.children(item);
const Void: VoidComponent<Props> = (props) => <input />;

// Event types
type Handler = JSX.EventHandler<HTMLButtonElement, MouseEvent>;
type ChangeHandler = JSX.ChangeEventHandler<HTMLInputElement>;
```
720
profiles/opencode/skill/solidjs/references/patterns.md
Normal file
@@ -0,0 +1,720 @@
# SolidJS Patterns & Best Practices

Common patterns, recipes, and best practices for SolidJS development.

## Component Patterns

### Controlled vs Uncontrolled Inputs

**Controlled:**
```tsx
function ControlledInput() {
  const [value, setValue] = createSignal("");

  return (
    <input
      value={value()}
      onInput={(e) => setValue(e.currentTarget.value)}
    />
  );
}
```

**Uncontrolled with ref:**
```tsx
function UncontrolledInput() {
  let inputRef: HTMLInputElement;

  const handleSubmit = () => {
    console.log(inputRef.value);
  };

  return (
    <>
      <input ref={inputRef!} />
      <button onClick={handleSubmit}>Submit</button>
    </>
  );
}
```
### Compound Components

```tsx
const Tabs = {
  Root: (props: ParentProps<{ defaultTab?: string }>) => {
    const [activeTab, setActiveTab] = createSignal(props.defaultTab ?? "");

    return (
      <TabsContext.Provider value={{ activeTab, setActiveTab }}>
        <div class="tabs">{props.children}</div>
      </TabsContext.Provider>
    );
  },

  List: (props: ParentProps) => (
    <div class="tabs-list" role="tablist">{props.children}</div>
  ),

  Tab: (props: ParentProps<{ value: string }>) => {
    const ctx = useTabsContext();
    return (
      <button
        role="tab"
        aria-selected={ctx.activeTab() === props.value}
        onClick={() => ctx.setActiveTab(props.value)}
      >
        {props.children}
      </button>
    );
  },

  Panel: (props: ParentProps<{ value: string }>) => {
    const ctx = useTabsContext();
    return (
      <Show when={ctx.activeTab() === props.value}>
        <div role="tabpanel">{props.children}</div>
      </Show>
    );
  }
};

// Usage
<Tabs.Root defaultTab="first">
  <Tabs.List>
    <Tabs.Tab value="first">First</Tabs.Tab>
    <Tabs.Tab value="second">Second</Tabs.Tab>
  </Tabs.List>
  <Tabs.Panel value="first">First Content</Tabs.Panel>
  <Tabs.Panel value="second">Second Content</Tabs.Panel>
</Tabs.Root>
```
### Render Props

```tsx
function MouseTracker(props: {
  children: (pos: { x: number; y: number }) => JSX.Element;
}) {
  const [pos, setPos] = createSignal({ x: 0, y: 0 });

  onMount(() => {
    const handler = (e: MouseEvent) => setPos({ x: e.clientX, y: e.clientY });
    window.addEventListener("mousemove", handler);
    onCleanup(() => window.removeEventListener("mousemove", handler));
  });

  return <>{props.children(pos())}</>;
}

// Usage
<MouseTracker>
  {(pos) => <div>Mouse: {pos.x}, {pos.y}</div>}
</MouseTracker>
```

### Higher-Order Components

```tsx
function withAuth<P extends object>(Component: Component<P>) {
  return (props: P) => {
    const { user } = useAuth();

    return (
      <Show when={user()} fallback={<Redirect to="/login" />}>
        <Component {...props} />
      </Show>
    );
  };
}

const ProtectedDashboard = withAuth(Dashboard);
```

### Polymorphic Components

```tsx
type PolymorphicProps<E extends keyof JSX.IntrinsicElements> = {
  as?: E;
} & JSX.IntrinsicElements[E];

function Box<E extends keyof JSX.IntrinsicElements = "div">(
  props: PolymorphicProps<E>
) {
  const [local, others] = splitProps(props as PolymorphicProps<"div">, ["as"]);

  return <Dynamic component={local.as || "div"} {...others} />;
}

// Usage
<Box>Default div</Box>
<Box as="section">Section element</Box>
<Box as="button" onClick={handleClick}>Button</Box>
```
## State Patterns

### Derived State with Multiple Sources

```tsx
function SearchResults() {
  const [query, setQuery] = createSignal("");
  const [filters, setFilters] = createSignal({ category: "all" });

  const results = createMemo(() => {
    const q = query().toLowerCase();
    const f = filters();

    return allItems()
      .filter(item => item.name.toLowerCase().includes(q))
      .filter(item => f.category === "all" || item.category === f.category);
  });

  return <For each={results()}>{item => <Item item={item} />}</For>;
}
```

### State Machine Pattern

```tsx
type State = "idle" | "loading" | "success" | "error";
type Event = { type: "FETCH" } | { type: "SUCCESS"; data: any } | { type: "ERROR"; error: Error };

function createMachine(initial: State) {
  const [state, setState] = createSignal<State>(initial);
  const [data, setData] = createSignal<any>(null);
  const [error, setError] = createSignal<Error | null>(null);

  const send = (event: Event) => {
    const current = state();

    switch (current) {
      case "idle":
        if (event.type === "FETCH") setState("loading");
        break;
      case "loading":
        if (event.type === "SUCCESS") {
          setData(event.data);
          setState("success");
        } else if (event.type === "ERROR") {
          setError(event.error);
          setState("error");
        }
        break;
    }
  };

  return { state, data, error, send };
}
```
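The `send` logic above can also be factored as a pure transition function, which makes the state table easy to unit-test. A framework-free sketch of the same switch (no signals; this is an illustration, not a Solid API):

```typescript
type State = "idle" | "loading" | "success" | "error";
type Event =
  | { type: "FETCH" }
  | { type: "SUCCESS"; data: unknown }
  | { type: "ERROR"; error: Error };

// Pure function: given the current state and an event, return the next state.
function transition(state: State, event: Event): State {
  switch (state) {
    case "idle":
      return event.type === "FETCH" ? "loading" : state;
    case "loading":
      if (event.type === "SUCCESS") return "success";
      if (event.type === "ERROR") return "error";
      return state;
    default:
      // "success" and "error" are terminal in this sketch
      return state;
  }
}
```

Inside the component, `send` would call `transition` and write the result with `setState`, keeping the reactive wrapper thin.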
### Optimistic Updates

```tsx
const [todos, setTodos] = createStore<Todo[]>([]);

async function deleteTodo(id: string) {
  const original = [...unwrap(todos)];

  // Optimistic remove
  setTodos(todos => todos.filter(t => t.id !== id));

  try {
    await api.deleteTodo(id);
  } catch {
    // Roll back on error
    setTodos(reconcile(original));
  }
}
```

### Undo/Redo

```tsx
function createHistory<T>(initial: T) {
  const [past, setPast] = createSignal<T[]>([]);
  const [present, setPresent] = createSignal<T>(initial);
  const [future, setFuture] = createSignal<T[]>([]);

  const canUndo = () => past().length > 0;
  const canRedo = () => future().length > 0;

  const set = (value: T | ((prev: T) => T)) => {
    const newValue = typeof value === "function"
      ? (value as (prev: T) => T)(present())
      : value;

    setPast(p => [...p, present()]);
    // Wrap in a thunk so a T that is itself a function isn't
    // mistaken for a functional update
    setPresent(() => newValue);
    setFuture([]);
  };

  const undo = () => {
    if (!canUndo()) return;

    const previous = past()[past().length - 1];
    setPast(p => p.slice(0, -1));
    setFuture(f => [present(), ...f]);
    setPresent(() => previous);
  };

  const redo = () => {
    if (!canRedo()) return;

    const next = future()[0];
    setPast(p => [...p, present()]);
    setFuture(f => f.slice(1));
    setPresent(() => next);
  };

  return { value: present, set, undo, redo, canUndo, canRedo };
}
```
|
||||||
|
|
||||||
|
## Custom Hooks/Primitives
|
||||||
|
|
||||||
|
### useLocalStorage
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
function createLocalStorage<T>(key: string, initialValue: T) {
|
||||||
|
const stored = localStorage.getItem(key);
|
||||||
|
const initial = stored ? JSON.parse(stored) : initialValue;
|
||||||
|
|
||||||
|
const [value, setValue] = createSignal<T>(initial);
|
||||||
|
|
||||||
|
createEffect(() => {
|
||||||
|
localStorage.setItem(key, JSON.stringify(value()));
|
||||||
|
});
|
||||||
|
|
||||||
|
return [value, setValue] as const;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### useDebounce
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
function createDebounce<T>(source: () => T, delay: number) {
|
||||||
|
const [debounced, setDebounced] = createSignal<T>(source());
|
||||||
|
|
||||||
|
createEffect(() => {
|
||||||
|
const value = source();
|
||||||
|
const timer = setTimeout(() => setDebounced(() => value), delay);
|
||||||
|
onCleanup(() => clearTimeout(timer));
|
||||||
|
});
|
||||||
|
|
||||||
|
return debounced;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Usage
|
||||||
|
const debouncedQuery = createDebounce(query, 300);
|
||||||
|
```
|
||||||
|
|
||||||
|
### useThrottle
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
function createThrottle<T>(source: () => T, delay: number) {
|
||||||
|
const [throttled, setThrottled] = createSignal<T>(source());
|
||||||
|
let lastRun = 0;
|
||||||
|
|
||||||
|
createEffect(() => {
|
||||||
|
const value = source();
|
||||||
|
const now = Date.now();
|
||||||
|
|
||||||
|
if (now - lastRun >= delay) {
|
||||||
|
lastRun = now;
|
||||||
|
setThrottled(() => value);
|
||||||
|
} else {
|
||||||
|
const timer = setTimeout(() => {
|
||||||
|
lastRun = Date.now();
|
||||||
|
setThrottled(() => value);
|
||||||
|
}, delay - (now - lastRun));
|
||||||
|
onCleanup(() => clearTimeout(timer));
|
||||||
|
}
|
||||||
|
});
|
||||||
|
|
||||||
|
return throttled;
|
||||||
|
}
|
||||||
|
```
|
||||||
|
|
||||||
|
### useMediaQuery
|
||||||
|
|
||||||
|
```tsx
|
||||||
|
function createMediaQuery(query: string) {
|
||||||
|
const mql = window.matchMedia(query);
|
||||||
|
const [matches, setMatches] = createSignal(mql.matches);
|
||||||
|
|
||||||
|
onMount(() => {
|
||||||
|
const handler = (e: MediaQueryListEvent) => setMatches(e.matches);
|
||||||
|
mql.addEventListener("change", handler);
|
||||||
|
onCleanup(() => mql.removeEventListener("change", handler));
|
||||||
|
});
|
||||||
|
|
||||||
|
return matches;
|
||||||
|
}
|
||||||
|
|
||||||
|
// Usage
|
||||||
|
const isMobile = createMediaQuery("(max-width: 768px)");
|
||||||
|
```
|
||||||
|
|
||||||
|
### useClickOutside

```tsx
function createClickOutside(
  ref: () => HTMLElement | undefined,
  callback: () => void
) {
  onMount(() => {
    const handler = (e: MouseEvent) => {
      const el = ref();
      if (el && !el.contains(e.target as Node)) {
        callback();
      }
    };
    document.addEventListener("click", handler);
    onCleanup(() => document.removeEventListener("click", handler));
  });
}

// Usage
let dropdownRef: HTMLDivElement;
createClickOutside(() => dropdownRef, () => setOpen(false));
```

### useIntersectionObserver

```tsx
function createIntersectionObserver(
  ref: () => HTMLElement | undefined,
  options?: IntersectionObserverInit
) {
  const [isIntersecting, setIsIntersecting] = createSignal(false);

  onMount(() => {
    const el = ref();
    if (!el) return;

    const observer = new IntersectionObserver(([entry]) => {
      setIsIntersecting(entry.isIntersecting);
    }, options);

    observer.observe(el);
    onCleanup(() => observer.disconnect());
  });

  return isIntersecting;
}
```

## Form Patterns

### Form Validation

```tsx
function createForm<T extends Record<string, any>>(initial: T) {
  const [values, setValues] = createStore<T>(initial);
  const [errors, setErrors] = createStore<Partial<Record<keyof T, string>>>({});
  const [touched, setTouched] = createStore<Partial<Record<keyof T, boolean>>>({});

  const handleChange = (field: keyof T) => (e: Event) => {
    const target = e.target as HTMLInputElement;
    setValues(field as any, target.value as any);
  };

  const handleBlur = (field: keyof T) => () => {
    setTouched(field as any, true);
  };

  const validate = (validators: Partial<Record<keyof T, (v: any) => string | undefined>>) => {
    let isValid = true;

    for (const [field, validator] of Object.entries(validators)) {
      if (validator) {
        const error = validator(values[field as keyof T]);
        setErrors(field as any, error as any);
        if (error) isValid = false;
      }
    }

    return isValid;
  };

  return { values, errors, touched, handleChange, handleBlur, validate, setValues };
}

// Usage
const form = createForm({ email: "", password: "" });

<input
  value={form.values.email}
  onInput={form.handleChange("email")}
  onBlur={form.handleBlur("email")}
/>
<Show when={form.touched.email && form.errors.email}>
  <span class="error">{form.errors.email}</span>
</Show>
```

### Field Array

```tsx
function createFieldArray<T>(initial: T[] = []) {
  const [fields, setFields] = createStore<T[]>(initial);

  const append = (value: T) => setFields(f => [...f, value]);
  const remove = (index: number) => setFields(f => f.filter((_, i) => i !== index));
  const update = (index: number, value: Partial<T>) => setFields(index, v => ({ ...v, ...value }));
  const move = (from: number, to: number) => {
    setFields(produce(f => {
      const [item] = f.splice(from, 1);
      f.splice(to, 0, item);
    }));
  };

  return { fields, append, remove, update, move };
}
```

## Performance Patterns

### Virtualized List

```tsx
function VirtualList<T>(props: {
  items: T[];
  itemHeight: number;
  height: number;
  renderItem: (item: T, index: number) => JSX.Element;
}) {
  const [scrollTop, setScrollTop] = createSignal(0);

  const startIndex = createMemo(() =>
    Math.floor(scrollTop() / props.itemHeight)
  );

  const visibleCount = createMemo(() =>
    Math.ceil(props.height / props.itemHeight) + 1
  );

  const visibleItems = createMemo(() =>
    props.items.slice(startIndex(), startIndex() + visibleCount())
  );

  return (
    <div
      style={{ height: `${props.height}px`, overflow: "auto" }}
      onScroll={(e) => setScrollTop(e.currentTarget.scrollTop)}
    >
      <div style={{ height: `${props.items.length * props.itemHeight}px`, position: "relative" }}>
        <For each={visibleItems()}>
          {(item, i) => (
            <div style={{
              position: "absolute",
              top: `${(startIndex() + i()) * props.itemHeight}px`,
              height: `${props.itemHeight}px`
            }}>
              {props.renderItem(item, startIndex() + i())}
            </div>
          )}
        </For>
      </div>
    </div>
  );
}
```

### Lazy Loading with Intersection Observer

```tsx
function LazyLoad(props: ParentProps<{ placeholder?: JSX.Element }>) {
  let ref: HTMLDivElement;
  const [isVisible, setIsVisible] = createSignal(false);

  onMount(() => {
    const observer = new IntersectionObserver(
      ([entry]) => {
        if (entry.isIntersecting) {
          setIsVisible(true);
          observer.disconnect();
        }
      },
      { rootMargin: "100px" }
    );
    observer.observe(ref);
    onCleanup(() => observer.disconnect());
  });

  return (
    <div ref={ref!}>
      <Show when={isVisible()} fallback={props.placeholder}>
        {props.children}
      </Show>
    </div>
  );
}
```

### Memoized Component

```tsx
// For expensive components that shouldn't re-render on parent updates
function MemoizedExpensiveList(props: { items: Item[] }) {
  // Component only re-renders when items actually change
  return (
    <For each={props.items}>
      {(item) => <ExpensiveItem item={item} />}
    </For>
  );
}
```

## Testing Patterns

### Component Testing

```tsx
import { render, fireEvent, screen } from "@solidjs/testing-library";

test("Counter increments", async () => {
  render(() => <Counter />);

  const button = screen.getByRole("button", { name: /increment/i });
  expect(screen.getByText("Count: 0")).toBeInTheDocument();

  fireEvent.click(button);
  expect(screen.getByText("Count: 1")).toBeInTheDocument();
});
```

### Testing with Context

```tsx
function renderWithContext(component: () => JSX.Element) {
  return render(() => (
    <ThemeProvider>
      <AuthProvider>
        {component()}
      </AuthProvider>
    </ThemeProvider>
  ));
}

test("Dashboard shows user", () => {
  renderWithContext(() => <Dashboard />);
  // ...
});
```

### Testing Async Components

```tsx
import { render, waitFor, screen } from "@solidjs/testing-library";

test("Loads user data", async () => {
  render(() => <UserProfile userId="123" />);

  expect(screen.getByText(/loading/i)).toBeInTheDocument();

  await waitFor(() => {
    expect(screen.getByText("John Doe")).toBeInTheDocument();
  });
});
```

## Error Handling Patterns

### Global Error Handler

```tsx
function App() {
  return (
    <ErrorBoundary
      fallback={(err, reset) => (
        <ErrorPage error={err} onRetry={reset} />
      )}
    >
      <Suspense fallback={<AppLoader />}>
        <Router>
          {/* Routes */}
        </Router>
      </Suspense>
    </ErrorBoundary>
  );
}
```

### Async Error Handling

```tsx
function DataComponent() {
  // Destructure refetch so the retry handler below has it in scope
  const [data, { refetch }] = createResource(fetchData);

  return (
    <Switch>
      <Match when={data.loading}>
        <Loading />
      </Match>
      <Match when={data.error}>
        <Error error={data.error} onRetry={() => refetch()} />
      </Match>
      <Match when={data()}>
        {(data) => <Content data={data()} />}
      </Match>
    </Switch>
  );
}
```

## Accessibility Patterns

### Focus Management

```tsx
function Modal(props: ParentProps<{ isOpen: boolean; onClose: () => void }>) {
  let dialogRef: HTMLDivElement;
  let previousFocus: HTMLElement | null = null;

  createEffect(() => {
    if (props.isOpen) {
      previousFocus = document.activeElement as HTMLElement;
      dialogRef.focus();
    } else if (previousFocus) {
      previousFocus.focus();
    }
  });

  return (
    <Show when={props.isOpen}>
      <Portal>
        <div
          ref={dialogRef!}
          role="dialog"
          aria-modal="true"
          tabIndex={-1}
          onKeyDown={(e) => e.key === "Escape" && props.onClose()}
        >
          {props.children}
        </div>
      </Portal>
    </Show>
  );
}
```

### Live Regions

```tsx
function Notifications() {
  const [message, setMessage] = createSignal("");

  return (
    <div
      role="status"
      aria-live="polite"
      aria-atomic="true"
      class="sr-only"
    >
      {message()}
    </div>
  );
}
```

profiles/opencode/skill/spec-planner/SKILL.md (new file, +223 lines)

---
name: spec-planner
description: Dialogue-driven spec development through skeptical questioning and iterative refinement. Triggers: "spec this out", feature planning, architecture decisions, "is this worth it?" questions, RFC/design doc creation, work scoping. Invoke Librarian for unfamiliar tech/frameworks/APIs.
---

# Spec Planner

Produce implementation-ready specs through rigorous dialogue and honest trade-off analysis.

## Core Philosophy

- **Dialogue over deliverables** - Plans emerge from discussion, not assumption
- **Skeptical by default** - Requirements are incomplete until proven otherwise
- **Second-order thinking** - Consider downstream effects and maintenance burden

## Workflow Phases

```
CLARIFY --[user responds]--> DISCOVER --[done]--> DRAFT --[complete]--> REFINE --[approved]--> DONE
   |                            |                   |                      |
   +--[still ambiguous]--<------+-------------------+----[gaps found]------+
```

**State phase at end of every response:**

```
---
Phase: CLARIFY | Waiting for: answers to questions 1-4
```

---

## Phase 1: CLARIFY (Mandatory)

**Hard rule:** No spec until the user has responded to at least one round of questions.

1. **STOP.** Do not proceed to planning.
2. Identify gaps in: scope, motivation, constraints, edge cases, success criteria
3. Ask 3-5 pointed questions that would change the approach. USE YOUR QUESTION TOOL.
4. **Wait for responses**

**IMPORTANT: Always use the `question` tool to ask clarifying questions.** Do NOT output questions as freeform text. The question tool provides structured options and better UX. Example:

```
question({
  questions: [{
    header: "Scope",
    question: "Which subsystems need detailed specs?",
    options: [
      { label: "VCS layer", description: "jj-lib + gix unified interface" },
      { label: "Review workflow", description: "GitHub PR-style local review" },
      { label: "Event system", description: "pub/sub + persistence" }
    ],
    multiple: true
  }]
})
```

| Category | Example |
|----------|---------|
| Scope | "Share where? Social media? Direct link? Embed?" |
| Motivation | "What user problem are we actually solving?" |
| Constraints | "Does this need to work with existing privacy settings?" |
| Success | "How will we know this worked?" |

**Escape prevention:** Even if the request seems complete, ask 2+ clarifying questions. Skip only for mechanical requests (e.g., "rename X to Y").

**Anti-patterns to resist:**
- "Just give me a rough plan" -> Still needs scope questions
- "I'll figure out the details" -> Those details ARE the spec
- Very long initial request -> Longer != clearer; probe assumptions

**Transition:** User answered AND no new ambiguities -> DISCOVER

---

## Phase 2: DISCOVER

**After clarification, before planning:** Understand the existing system.

Launch explore subagents in parallel:

```
Task(
  subagent_type="explore",
  description="Explore [area name]",
  prompt="Explore [area]. Return: key files, abstractions, patterns, integration points."
)
```

| Target | What to Find |
|--------|--------------|
| Affected area | Files, modules that will change |
| Existing patterns | How similar features are implemented |
| Integration points | APIs, events, data flows touched |

**If unfamiliar tech is involved**, invoke Librarian:

```
Task(
  subagent_type="librarian",
  description="Research [tech name]",
  prompt="Research [tech] for [use case]. Return: recommended approach, gotchas, production patterns."
)
```

**Output:** Brief architecture summary before proposing solutions.

**Transition:** System context understood -> DRAFT

---

## Phase 3: DRAFT

Apply the planning framework from [decision-frameworks.md](./references/decision-frameworks.md):

1. **Problem Definition** - What are we solving? For whom? Cost of not solving?
2. **Constraints Inventory** - Time, system, knowledge, scope ceiling
3. **Solution Space** - Simplest -> Balanced -> Full engineering solution
4. **Trade-off Analysis** - See table format in references
5. **Recommendation** - One clear choice with reasoning

Use the appropriate template from [templates.md](./references/templates.md):
- **Quick Decision** - Scoped technical choices
- **Feature Plan** - New feature development
- **ADR** - Architecture decisions
- **RFC** - Larger proposals

**Transition:** Spec produced -> REFINE

---

## Phase 4: REFINE

Run the completeness check:

| Criterion | Check |
|-----------|-------|
| Scope bounded | Every deliverable listed; non-goals explicit |
| Ambiguity resolved | No "TBD" or "to be determined" |
| Acceptance testable | Each criterion pass/fail verifiable |
| Dependencies ordered | Clear what blocks what |
| Types defined | Data shapes specified (not "some object") |
| Effort estimated | Each deliverable has S/M/L/XL |
| Risks identified | At least 2 risks with mitigations |
| Open questions | Resolved OR assigned owner |

**If any criterion fails:** Return to dialogue. "To finalize, I need clarity on: [failing criteria]."

**Transition:** All criteria pass + user approval -> DONE

---

## Phase 5: DONE

### Final Output

```
=== Spec Complete ===

Phase: DONE
Type: <feature plan | architecture decision | refactoring | strategy>
Effort: <S/M/L/XL>
Status: Ready for task breakdown

Discovery:
- Explored: <areas investigated>
- Key findings: <relevant architecture/patterns>

Recommendation:
<brief summary>

Key Trade-offs:
- <what we're choosing vs alternatives>

Deliverables (Ordered):
1. [D1] (effort) - depends on: -
2. [D2] (effort) - depends on: D1

Open Questions:
- [ ] <if any remain> -> Owner: [who]
```

### Write Spec to File (MANDATORY)

1. Derive the filename from the feature/decision name (kebab-case)
2. Write the spec to `specs/<filename>.md`
3. Confirm: `Spec written to: specs/<filename>.md`

---

## Effort Estimates

| Size | Time | Scope |
|------|------|-------|
| **S** | <1 hour | Single file, isolated change |
| **M** | 1-3 hours | Few files, contained feature |
| **L** | 1-2 days | Cross-cutting, multiple components |
| **XL** | >2 days | Major refactor, new system |

## Scope Control

When scope creeps:
1. **Name it:** "That's scope expansion. Let's finish X first."
2. **Park it:** "Added to Open Questions. Revisit after core spec stable."
3. **Cost it:** "Adding Y changes effort from M to XL. Worth it?"

**Hard rule:** If scope changes, re-estimate and flag explicitly.

## References

| File | When to Read |
|------|--------------|
| [templates.md](./references/templates.md) | Output formats for plans, ADRs, RFCs |
| [decision-frameworks.md](./references/decision-frameworks.md) | Complex multi-factor decisions |
| [estimation.md](./references/estimation.md) | Breaking down work, avoiding underestimation |
| [technical-debt.md](./references/technical-debt.md) | Evaluating refactoring ROI |

## Integration

| Agent | When to Invoke |
|-------|----------------|
| **Librarian** | Research unfamiliar tech, APIs, frameworks |
| **Oracle** | Deep architectural analysis, complex debugging |

references/decision-frameworks.md (new file, +75 lines)

# Decision Frameworks

## Reversibility Matrix

| Decision Type | Approach |
|---------------|----------|
| **Two-way door** (easily reversed) | Decide fast, learn from outcome |
| **One-way door** (hard to reverse) | Invest time in analysis |

Most decisions are two-way doors. Don't over-analyze.

## Cost of Delay

```
Daily Cost = (Value Delivered / Time to Deliver) x Risk Factor
```

Use when prioritizing:
- High daily cost -> Do first
- Low daily cost -> Can wait
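
As a sketch of the arithmetic (the function name and the dollar/day figures below are hypothetical, not from this doc):

```typescript
// Daily Cost = (Value Delivered / Time to Deliver) x Risk Factor
function dailyCostOfDelay(value: number, daysToDeliver: number, riskFactor: number): number {
  return (value / daysToDeliver) * riskFactor;
}

// Hypothetical items: value in dollars, delivery time in days
const checkoutFix = dailyCostOfDelay(10_000, 20, 0.8); // 400/day -> do first
const adminTheme = dailyCostOfDelay(3_000, 30, 0.5);   // 50/day -> can wait
```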

## RICE Scoring

| Factor | Question | Scale |
|--------|----------|-------|
| **R**each | How many users affected? | # users/period |
| **I**mpact | How much per user? | 0.25, 0.5, 1, 2, 3 |
| **C**onfidence | How sure are we? | 20%, 50%, 80%, 100% |
| **E**ffort | Person-weeks | 0.5, 1, 2, 4, 8+ |

```
RICE = (Reach x Impact x Confidence) / Effort
```
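
A worked comparison, with hypothetical numbers chosen to show how a broad small win can outscore a narrow big bet:

```typescript
// RICE = (Reach x Impact x Confidence) / Effort
function rice(reach: number, impact: number, confidence: number, effortWeeks: number): number {
  return (reach * impact * confidence) / effortWeeks;
}

// Hypothetical features: confidence expressed as a fraction
const quickWin = rice(1000, 0.5, 0.8, 1); // 400
const bigBet = rice(200, 3, 0.5, 4);      // 75 -> quickWin ranks higher
```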

## Technical Decision Checklist

Before committing to a technical approach:

- [ ] Have we talked to someone who's done this before?
- [ ] What's the simplest version that teaches us something?
- [ ] What would make us reverse this decision?
- [ ] Who maintains this in 6 months?
- [ ] What's our rollback plan?

## When to Build vs Buy vs Adopt

| Signal | Build | Buy | Adopt (OSS) |
|--------|-------|-----|-------------|
| Core differentiator | Yes | No | Maybe |
| Commodity problem | No | Yes | Yes |
| Tight integration needed | Yes | Maybe | Maybe |
| Team has expertise | Yes | N/A | Yes |
| Time pressure | No | Yes | Maybe |
| Long-term control needed | Yes | No | Maybe |

## Decomposition Strategies

### Vertical Slicing
Cut features into thin end-to-end slices that deliver value:
```
Bad: "Build database layer" -> "Build API" -> "Build UI"
Good: "User can see their profile" -> "User can edit name" -> "User can upload avatar"
```

### Risk-First Ordering
1. Identify highest-risk unknowns
2. Build spike/proof-of-concept for those first
3. Then build around proven foundation

### Dependency Mapping
```
[Feature A] -depends on-> [Feature B] -depends on-> [Feature C]
                                                         ^
                                                    Start here
```

references/estimation.md (new file, +69 lines)

# Estimation

## Why Estimates Fail

| Cause | Mitigation |
|-------|------------|
| Optimism bias | Use historical data, not gut |
| Missing scope | List "obvious" tasks explicitly |
| Integration blindness | Add 20-30% for glue code |
| Unknown unknowns | Add buffer based on novelty |
| Interruptions | Assume 60% focused time |

## Estimation Techniques

### Three-Point Estimation
```
Expected = (Optimistic + 4xMostLikely + Pessimistic) / 6
```
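
The formula weights the most-likely case heavily while letting a bad worst case pull the expectation up. A minimal sketch (the task and hour figures are hypothetical):

```typescript
// Expected = (Optimistic + 4 x MostLikely + Pessimistic) / 6
function threePoint(optimistic: number, mostLikely: number, pessimistic: number): number {
  return (optimistic + 4 * mostLikely + pessimistic) / 6;
}

// Hypothetical task: best case 2h, most likely 4h, worst case 12h
const expectedHours = threePoint(2, 4, 12); // 5 -- higher than the "most likely" 4h
```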

### Relative Sizing
Compare to known references:
- "This is about twice as complex as Feature X"
- Use Fibonacci (1, 2, 3, 5, 8, 13) to reflect uncertainty

### Task Decomposition
1. Break into tasks <=4 hours
2. If you can't decompose, spike first
3. Sum tasks + 20% integration buffer

## Effort Multipliers

| Factor | Multiplier |
|--------|------------|
| New technology | 1.5-2x |
| Unclear requirements | 1.3-1.5x |
| External dependencies (waiting on others) | 1.2-1.5x |
| Legacy/undocumented code | 1.3-2x |
| Production deployment | 1.2x |
| First time doing X | 2-3x |
| Context switching (other priorities) | 1.3x |
| Yak shaving risk (unknown unknowns) | 1.5x |
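
Multipliers compound, which is why two "small" risk factors can double an estimate. A sketch of applying them (the base estimate and chosen factors are hypothetical):

```typescript
// Multiply a base estimate by one factor per applicable table row
function adjustedEstimate(baseHours: number, multipliers: number[]): number {
  return multipliers.reduce((acc, m) => acc * m, baseHours);
}

// Hypothetical: 8h base, new technology (1.5x) and unclear requirements (1.3x)
const estimate = adjustedEstimate(8, [1.5, 1.3]); // ~15.6 hours, nearly double
```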

## Hidden Work Checklist

Always include time for:
- [ ] Code review (20% of dev time)
- [ ] Testing (30-50% of dev time)
- [ ] Documentation (10% of dev time)
- [ ] Deployment/config (varies)
- [ ] Bug fixes from testing (20% buffer)
- [ ] Interruptions / competing priorities

## When to Re-Estimate

Re-estimate when:
- Scope changes materially
- A major unknown becomes known
- Actual progress diverges >30% from the estimate

## Communicating Estimates

**Good:** "1-2 weeks, 70% confidence; main risk is the third-party API integration"

**Bad:** "About 2 weeks"

Always include:
1. Range, not point estimate
2. Confidence level
3. Key assumptions/risks