Andrew's Software
My open source software is packaged and deployed from a central repository: anixpkgs. See the navigation menu for individual package documentation.
LATEST RELEASE: v6.9.1
This repository of personally maintained Nix derivations, overlays, and machine closures is essentially the centralized mechanism by which I maintain all of the software I write and use for both personal projects and recreation. In other words, I employ Nix as both a package manager for my software as well as an operating system for all of my computers, Raspberry Pi’s, etc.
These docs provide an overview of how I manage the OS’s of my machines as well as the software that I personally maintain, all within the anixpkgs repo.
Why Nix?
Some of the main reasons why I prefer Nix as a package manager:
- I highly value code that is not only compelling in its application but that is also maintainable. Code that is subject to compiler/interpreter and external dependency changes over time must be designed with the future in mind. Nix provides me an almost trivial mechanism to incrementally update (and roll back) external dependencies, compilers, or anything else pertaining to a software ecosystem that you can think of.
- When I write a cool piece of software on one machine, I want to be able to “deploy” that software across all my machines with minimal effort and without having to worry about broken or missing dependencies. With its hermetic build system, I have the peace of mind that the code that I package in Nix will be transferable to essentially any other machine that uses Nix.
Some of the main reasons why I prefer NixOS as an operating system for all of my computers:
- The same things I value in packaging software, I also value in “packaging an operating system.” NixOS allows me to have total control over every single package in my OS, allowing me to customize every aspect and make changes with the peace of mind that I can always roll back breaking changes.
- There is something very satisfying and empowering to me about being able to declaratively define the OS closures for all of my machines in just several text files. The overlay-focused design of NixOS modules makes it so that I can design the OS’s of my machines hierarchically, defining packages that are shared between all of my computers as well as packages that are specific to certain computers only. Moreover, when I buy a replacement computer it takes a minimal amount of steps to turn that new computer into an all-intents-and-purposes clone of my old one, which is a capability I value very highly for a lot of reasons.
Given the above, why do I prefer Nix over Docker?
- To be clear, I do think that Nix and Docker can be used together effectively. However:
- In general, one will require a mishmash of custom or third-party build and deployment tooling to construct and glue a bunch of Docker containers together if one is trying to architect a complete system using Docker (as could be the case with code running on a robot). Nix provides more of a unified framework to achieve the same benefits, and that ecosystem is much more aesthetically pleasing to me than e.g., "YAML engineering."
- Docker containers sit atop an already existing, fully fleshed out operating system. Nix allows me to (once again, within a unified framework) control literally everything about even the operating system in an attempt to avoid unintended side effects at all levels of integration.
Installation and Usage Patterns
The packages defined in this repo are accessible to anyone who uses Nix, which can be installed in two forms:
- “Standalone” Nix: This just installs the package manager and is the easiest option if you only want access to the packages in this repo. It can be augmented with a tool called home-manager so that you can use at least some of the closure components alongside your normal OS as well.
- NixOS: This option is much more invasive as it wholesale replaces your entire operating system, and should only be done if you really know what you’re doing (and love Nix). More instructions in the machines documentation.
For either method, ensure that your Nix version is >= 2.4.
The software packaged in anixpkgs is buildable both through Nix flakes and through traditional Nix shells. It's recommended to use flakes, as that method is more "pure" and allows for more portable integration with the public cache.
Accessing the Packages Using Flakes
Here is a flake.nix file that will get you a shell with select anixpkgs software (version v6.9.1) while also giving you access to the public cache to avoid building from source on your machine:
{
  description = "Nix shell for anixpkgs.";
  nixConfig.substituters = [
    "https://cache.nixos.org/"
    "https://github-public.cachix.org"
  ];
  nixConfig.trusted-public-keys = [
    "cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY="
    "github-public.cachix.org-1:xofQDaQZRkCqt+4FMyXS5D6RNenGcWwnpAXRXJ2Y5kc="
  ];
  inputs = {
    nixpkgs.url = "github:goromal/anixpkgs?ref=refs/tags/v6.9.1";
  };
  outputs = { self, nixpkgs }:
    let pkgs = nixpkgs.legacyPackages.x86_64-linux;
    in with pkgs; {
      devShell.x86_64-linux = mkShell {
        buildInputs = [
          pb
          fixfname
          pkgshell
        ];
      };
    };
}
Access the packages with nix develop.
Accessing the Packages Using shell.nix
Here are some shell.nix files to access Python packages (using version v6.9.1 of the packages):
let
  pkgs = import (builtins.fetchTarball
    "https://github.com/goromal/anixpkgs/archive/refs/tags/v6.9.1.tar.gz") {};
  python-with-my-packages = pkgs.python311.withPackages (p: with p; [
    numpy
    matplotlib
    geometry
    pyceres
  ]);
in
python-with-my-packages.env
or:
let
  pkgs = import (builtins.fetchTarball
    "https://github.com/goromal/anixpkgs/archive/refs/tags/v6.9.1.tar.gz") {};
in pkgs.mkShell {
  buildInputs = [
    pkgs.python311
    pkgs.python311.pkgs.numpy
    pkgs.python311.pkgs.geometry
    pkgs.python311.pkgs.find_rotational_conventions
  ];
  shellHook = ''
    # Tells pip to put packages into $PIP_PREFIX instead of the usual locations.
    # See https://pip.pypa.io/en/stable/user_guide/#environment-variables.
    export PIP_PREFIX=$(pwd)/_build/pip_packages
    export PYTHONPATH="$PIP_PREFIX/${pkgs.python311.sitePackages}:$PYTHONPATH"
    export PATH="$PIP_PREFIX/bin:$PATH"
    unset SOURCE_DATE_EPOCH
  '';
}
And for general software packages:
let
  pkgs = import (builtins.fetchTarball
    "https://github.com/goromal/anixpkgs/archive/refs/tags/v6.9.1.tar.gz") {};
in with pkgs; mkShell {
  buildInputs = [
    pb
    fixfname
    pkgshell
  ];
}
Access the packages with nix-shell.
Machine Management
These notes are still a work-in-progress and are currently largely for my personal use only.
Home-Manager Example
- Install Nix standalone:
curl --proto '=https' --tlsv1.2 -sSf -L https://install.determinate.systems/nix | sh -s -- install
- Set proper Nix settings in /etc/nix/nix.conf:
substituters = https://cache.nixos.org/ https://github-public.cachix.org
trusted-public-keys = cache.nixos.org-1:6NCHdD59X431o0gWypbMrAURkbJ16ZPMQFGspcDShjY= github-public.cachix.org-1:xofQDaQZRkCqt+4FMyXS5D6RNenGcWwnpAXRXJ2Y5kc=
narinfo-cache-positive-ttl = 0
narinfo-cache-negative-ttl = 0
experimental-features = nix-command flakes auto-allocate-uids
- Add these Nix channels via nix-channel --add URL NAME:
$ nix-channel --list
home-manager https://github.com/nix-community/home-manager/archive/release-24.05.tar.gz
nixpkgs https://nixos.org/channels/nixos-24.05
- Install home-manager: https://nix-community.github.io/home-manager/index.xhtml#sec-install-standalone
Example home.nix file for personal use:
{ config, pkgs, lib, ... }:
let
  user = "andrew";
  homedir = "/home/${user}";
  anixsrc = ./path/to/sources/anixpkgs/.;
in with import ../dependencies.nix; {
  home.username = user;
  home.homeDirectory = homedir;
  programs.home-manager.enable = true;
  imports = [
    "${anixsrc}/pkgs/nixos/components/opts.nix"
    "${anixsrc}/pkgs/nixos/components/base-pkgs.nix"
    "${anixsrc}/pkgs/nixos/components/base-dev-pkgs.nix"
    "${anixsrc}/pkgs/nixos/components/x86-rec-pkgs.nix"
    "${anixsrc}/pkgs/nixos/components/x86-graphical-pkgs.nix"
    "${anixsrc}/pkgs/nixos/components/x86-graphical-dev-pkgs.nix"
    "${anixsrc}/pkgs/nixos/components/x86-graphical-rec-pkgs.nix"
  ];
  mods.opts.standalone = true;
  mods.opts.homeDir = homedir;
  mods.opts.homeState = "23.05";
  mods.opts.browserExec = "google-chrome-stable";
}
*-rec-* packages can be removed for non-recreational use.
Symlink to ~/.config/home-manager/home.nix.
Corresponding ~/.bashrc:
export NIX_PATH=$HOME/.nix-defexpr/channels:/nix/var/nix/profiles/per-user/root/channels${NIX_PATH:+:$NIX_PATH}
. "$HOME/.nix-profile/etc/profile.d/hm-session-vars.sh"
export NIXPKGS_ALLOW_UNFREE=1
# alias code='codium'
# eval "$(direnv hook bash)"
Build and Deploy a Raspberry Pi NixOS SD Configuration
Since the hardware configuration for the Raspberry Pi is well understood, it makes sense to skip the installer step and deploy a fully-fledged closure instead.
nixos-generate -f sd-aarch64 --system aarch64-linux -c /path/to/anixpkgs/pkgs/nixos/configurations/config.nix [-I nixpkgs=/path/to/alternative/nixpkgs]
nix-shell -p zstd --run "unzstd -d /nix/store/path/to/image.img.zst"
sudo dd if=/path/to/image.img of=/dev/sdX bs=4096 conv=fsync status=progress
On the Pi, connect to the internet, copy over SSH keys (maybe no need for /root/.ssh/), and then set up the Nix channel(s):
sudo nix-channel --add https://nixos.org/channels/nixos-[NIXOS-VERSION] nixos
sudo nix-channel --update
Note that the nixos-generate step may not have "aarch-ified" the anixpkgs packages (that's something for me to look into), so the anix-upgrade setup steps are especially important:
- Make a ~/sources directory
- Symlink the configuration file even if it doesn't exist yet
- Run anix-upgrade to aarch-ify everything
Build a Raspberry Pi NixOS SD Installer Image
nixos-generate -f sd-aarch64-installer --system aarch64-linux -c /path/to/rpi/config.nix [-I nixpkgs=/path/to/alternative/nixpkgs]
nix-shell -p zstd --run "unzstd -d /nix/store/path/to/image.img.zst"
sudo dd if=/path/to/image.img of=/dev/sdX bs=4096 conv=fsync status=progress
On the Pi, copy over SSH keys (including to /root/.ssh/!) and then set up the Nix channel:
sudo nix-channel --add https://nixos.org/channels/nixos-[NIXOS-VERSION] nixos
sudo nix-channel --update
Installation Instructions on a New Machine
Sources
- https://nixos.wiki/wiki/NixOS_Installation_Guide
- https://alexherbo2.github.io/wiki/nixos/install-guide/
Note: You can replace steps 1-8 with a kexec kernel load and disk formatting with disko:
- Download a NixOS ISO image.
- Plug in a USB stick large enough to accommodate the image.
- Find the right device with lsblk or fdisk -l. Replace /dev/sdX with the proper device (do not use /dev/sdX1 or other partitions of the disk; use the whole disk /dev/sdX).
- Burn the ISO to the USB stick with:
cp nixos-xxx.iso /dev/sdX
# OR
dd if=nixos.iso of=/dev/sdX bs=4M status=progress conv=fdatasync
- On the new machine, do a one-time UEFI boot from the USB stick (you will need to disable Secure Boot in the BIOS first)
- Wipe the file system:
wipefs [--all -a] /dev/sda
- Partition the disk with gparted:
- Create a GUID table: Device > Create Partition Table > GPT
- Select /dev/sda (Entire disk > Select)
- Create the boot partition: Partition > New
- Free space preceding (MiB): 1
- New size (MiB): 512
- Free space following (MiB): Rest
- Align to: MiB
- Create as: Primary Partition
- Partition name: EFI
- File system: fat32
- Label: EFI
- Add the boot flag:
  - Right-click on /dev/sda1 to manage flags
  - Add the boot flag and enable esp (should be automatic with GPT)
- Create the root partition: Partition > New
- Free space preceding (MiB): 0
- New size (MiB): Rest
- Free space following (MiB): 0
- Align to: MiB
- Create as: Primary Partition
- Partition name: NixOS
- File system: ext4
- Label: NixOS
- Apply modifications
- Mount root and boot partitions:
mkdir /mnt/nixos
mount /dev/disk/by-label/NixOS /mnt/nixos
mkdir /mnt/nixos/boot
mount /dev/disk/by-label/EFI /mnt/nixos/boot
- Generate an initial configuration (you'll want it to enable WiFi connectivity and a web browser at least):
nixos-generate-config --root /mnt/nixos
# /etc/nixos/configuration.nix
# /etc/nixos/hardware-configuration.nix
- Do the installation:
nixos-install --root /mnt/nixos
- If everything went well:
reboot
- Log into GitHub and generate an SSH key for authentication.
- Clone and link an editable version of the configuration:
mkdir -p /data/andrew/sources # or in an alternate location, for now
git clone git@github.com:goromal/anixpkgs.git /data/andrew/sources/anixpkgs
cat /etc/nixos/hardware-configuration.nix > /data/andrew/sources/anixpkgs/pkgs/nixos/hardware/[hardware-configuration.nix] # update link/headings in configuration.nix
sudo mv /etc/nixos/configuration.nix /etc/nixos/old.configuration.nix
sudo mv /etc/nixos/hardware-configuration.nix /etc/nixos/old.hardware-configuration.nix
sudo ln -s /data/andrew/sources/anixpkgs/pkgs/nixos/configurations/[your-configuration.nix] /etc/nixos/configuration.nix
- Make other needed updates to the configuration, then apply:
sudo nixos-rebuild boot
sudo reboot
Upgrading NixOS versions with anixpkgs
Aside from the source code changes in anixpkgs, ensure that your channels have been updated for the root user:
# e.g., upgrading to 24.05:
home-manager https://github.com/nix-community/home-manager/archive/release-24.05.tar.gz
nixos https://nixos.org/channels/nixos-24.05
nixpkgs https://nixos.org/channels/nixos-24.05
sudo nix-channel --update
Then upgrade with:
anix-upgrade [source specification] --local --boot
Cloud Syncing
The following sync mappings are recommended (set up using rclone; a small helper sketch follows the list):
- dropbox:secrets -> rclone copy -> ~/secrets
- dropbox:configs -> rclone copy -> ~/configs
- dropbox:Games -> rclone copy -> ~/games
- box:data -> rclone copy -> ~/data
- box:.devrc -> rclone copy -> ~/.devrc
- drive:Documents -> rclone copy -> ~/Documents
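If it helps, the copies above can be scripted. Here is a minimal Python sketch (a convenience illustration, not part of anixpkgs) that assumes the rclone remotes listed above are already configured on the machine:

import subprocess
from pathlib import Path

# (remote source, local destination) pairs, taken from the list above;
# file-like targets (e.g., box:.devrc) may want `rclone copyto` instead of `rclone copy`
SYNC_MAP = [
    ("dropbox:secrets", "~/secrets"),
    ("dropbox:configs", "~/configs"),
    ("dropbox:Games", "~/games"),
    ("box:data", "~/data"),
    ("drive:Documents", "~/Documents"),
]

for remote, local in SYNC_MAP:
    # `rclone copy SOURCE DEST` copies the remote contents into the local directory
    subprocess.run(["rclone", "copy", remote, str(Path(local).expanduser())], check=True)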
Build a NixOS ISO Image
TODO (untested); work out hardware configuration portion.
nixos-generate -f iso -c /path/to/personal/configuration.nix [-I nixpkgs=/path/to/alternative/nixpkgs]
sudo dd if=/path/to/nixos.iso of=/dev/sdX bs=4M conv=fsync status=progress
RFCs
Request For Comments (RFC) documents are convenient for organizing and iterating on designs for pieces of software. They can serve as north stars for the implementation of more complex software. Below are some examples of RFCs that I've used to guide some personal projects.
RFC: Continuous OS Deployment
Summary
Operating System Continuous Deployment (OSCD) refers to the process of seamlessly upgrading my NixOS machines to the latest and greatest release with as little overhead and manual fiddling as possible. This RFC proposes an OSCD overhaul that achieves a greater level of automation and reproducibility in the OS upgrade process via a few added GitHub CD hooks, the new CLI tool anix-upgrade, and some development process changes.
Motivation
NixOS upgrades on my machines are done using the nixos-rebuild command, which builds the system configuration according to the (symlinked) file /etc/nixos/configuration.nix, which points to a mutable root configuration file in ~/sources/anixpkgs/.
This model lends itself best to rapid prototyping and testing, as changes to the configuration can be immediately tested without having to commit any code. However, release deployment under this model is also left more ad hoc and manual (and thus error-prone). For example, to tag and deploy a new release off of master, the following manual steps must be taken:
1. The desired release commit is tagged, either locally or via the GitHub releases page.
2. The dependencies.nix file for all NixOS configurations is modified to refer to the new tag, manifesting as a new commit on the head of master.
3. ~/sources/anixpkgs/ is checked out to the commit from step 2 (not step 1, ironically). Any local dev changes need to be stashed.
4. A nixos-rebuild command is run.
This RFC calls for a mature, stable OSCD pipeline that removes the need for the manual steps listed above while maintaining flexibility for rapid (and decoupled) on-machine development and testing.
Driving requirements
System-level requirements
- [R1] OSCD shall be totally decoupled from development work; there shall be no possibility of one accidentally polluting the other.
- [R2] OSCD shall be atomic such that an upgrade cannot be corrupted via a mismanagement or erroneous execution of steps.
- [R3] OS release tagging shall be wholly executable within an anixpkgs pull request, and shall be entirely automated except for a manual specification of the level of release (e.g., major, minor, patch) that the pull request corresponds to.
- [R4] An OS upgrade on-machine shall take no more than one step to complete.
- [R5] The same upgrade mechanism from [R4] shall enable rapid prototyping of active development branches.
Software-level requirements
- [R1.1] Development within anixpkgs shall happen in a separate location from the symlinked ~/sources/anixpkgs directory, which shall be reserved for OS upgrades.
- [R1.2] ~/sources/anixpkgs shall be read-only.
- [R2.1] The release tagging process shall execute all required steps automatically, prompted by a single initiatory step, within the remote CD pipeline.
- [R2.2] All automated steps for [R2.1] shall kick off only after a pull request merge into master, which is push-protected.
- [R2.3] The OS upgrade process shall consist of an atomic source preparation step followed by an atomic rebuild step, both chained together automatically.
- [R3.1] The tagging process from [R2.1] shall be initiated via adding labels to a pull request. No further action should be required.
- [R4.1] OS upgrades shall be offered by a CLI tool that, with no arguments specified, will upgrade the system to the most recent anixpkgs release off of master.
- [R4.2] The OS upgrade CLI tool shall perform rebuild switch by default, but allow for rebuild boot to be manually specified instead.
- [R5.1] The OS upgrade CLI tool shall allow (via provided arguments) for an upgrade to any particular tag, branch, or commit within the remote anixpkgs repository.
- [R5.2] The OS upgrade CLI tool shall allow for builds with packages local to the specified build target, and not necessarily tied to that target's prescribed release version.
Detailed design
The OSCD requirements are addressed with three components: an updated deployment pipeline within anixpkgs GitHub Actions, an OS upgrade CLI tool, and a new policy for anixpkgs development and testing.
Deployment pipeline
Requirements [R2.1-2,3.1] are proposed to be fulfilled by adding three GitHub Actions jobs with write permissions on master to the deployment workflow. Each job will be responsible for tagging a new major, minor, or patch release, respectively, and will be triggered by a corresponding pull request label on a merged pull request only.
Each of these jobs will consist of the following steps:
- Only execute if a merged pull request had the appropriate release label.
- Check out anixpkgs master.
- Run a version increment script that increments the release version according to the release label.
- Commit the semantic version changes and tag that commit with the new semantic version string.
- Push the commit and corresponding tag to master.
The version increment script will simply take as an input the release increment type (major, minor, or patch) and increment accordingly:
- (Major) x.y.z -> x+1.0.0
- (Minor) x.y.z -> x.y+1.0
- (Patch) x.y.z -> x.y.z+1
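For concreteness, the increment logic above could be sketched in Python as follows (an illustration only; the actual script in anixpkgs may differ in its details):

def bump_version(version: str, release_type: str) -> str:
    """Increment a semantic version string according to the release label."""
    major, minor, patch = (int(x) for x in version.split("."))
    if release_type == "major":
        return f"{major + 1}.0.0"
    if release_type == "minor":
        return f"{major}.{minor + 1}.0"
    if release_type == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"Unknown release type: {release_type}")

# example: a minor release off of v6.9.1
assert bump_version("6.9.1", "minor") == "6.10.0"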
In the event of a pull request getting merged with multiple labels, the execution order of the release tag jobs will be major -> minor -> patch, such that each successive tag will be visible in the resulting semantic version. If the order were reversed, for example, a patch release increment would be obscured by a minor release increment, which would zero out the patch field.
OS upgrade CLI tool: anix-upgrade
Once a new release has been properly and automatically tagged on the anixpkgs remote, a CLI tool called anix-upgrade is proposed to upgrade the system to the latest tag in an automated and incorruptible (i.e., no need nor opportunity to manually modify code or any configurations during the process) fashion, fulfilling [R1.2,2.3,4.1-2,5.1-2].
As implied by [R2.3], anix-upgrade will do two things:
1. Prepare a read-only (and Git-less) version of anixpkgs in the ~/sources directory according to the exact version specified via arguments through the CLI.
2. Rebuild the system configuration according to the updated source/configuration.
(1) may be naturally accomplished via a Nix derivation, which by definition must consist of a read-only (and reproducible) output with no tolerance for side effects from things like .git directories. The CLI tool will allow for (1) to be constructed from:
- No argument at all, which will assume that the target source corresponds to the current head of anixpkgs master. This will be the way to perform standard, non-dev-prototyping upgrades.
- An alternative release tag string, a development branch name (assumed to be at the head of that branch), or a commit hash.
- An optional specification that the OS packages should be built from the local packages in that specific version of the anixpkgs source tree, and not necessarily from the prescribed release version of the packages.
Because Nix derivations cannot clone Git repositories, the Nix built-in tools fetchGit and fetchTarball must be used to fetch the exact right version of the source. In my experience, fetchGit does not consistently fetch the expected "head-of" commit when only a branch name is specified, so in this implementation fetchTarball is to be used for any version of (1) that does not specify the full commit hash. This is possible because GitHub allows URL-based fetching of zipped source archives by tag name or branch name (reliably delivering the head-of commit in the latter case).
(2) is accomplished by aliasing to the nixos-rebuild command (exposing an optional flag to require a reboot for changes to take effect, as per [R4.2]) and only executing once (1) has successfully completed.
Development policy changes
All in all, the above changes serve to reduce the cognitive load associated with releasing and deploying a new version of anixpkgs once development is complete. In the same vein, the above changes are best served by fulfilling [R1.1] as well, which moves anixpkgs development out of the ~/sources directory and into the ~/dev directory, consistent with development on individual repositories. While mostly a clerical change, there are a couple of implications:
- The machines docs should be updated to modify instructions for setting up a new machine from scratch. The setup process may now be slightly more complicated on a one-time basis.
- The ~/.devrc file on every development machine (see devshell) should be modified to set the pkgs_dir variable to the ~/dev/ location and not the ~/sources/ one. This is to preserve the ability to mutate attribute-based sources with the same level of flexibility as before.
Drawbacks
The principal drawback of this design is the new requirement to commit OS prototype changes to GitHub when prototyping new OS versions via the specification of a remote tag, branch, or commit with the OS upgrade tool.
Alternatives
To address the slight OS prototyping drawback, an additional option may be added to anix-upgrade to point to a local anixpkgs source tree, which would symlink to the mutable source tree rather than an immutable Nix derivation that pulls from GitHub.
Unresolved questions
- Must OS upgrades always be manually triggered via anix-upgrade, or might it be fruitful to execute automatic upgrades in the future?
- How may OSCD fruitfully apply to machine closures that import anixpkgs closures but specify their own configurations?
C++ Packages
Packages written in C++.
- manif-geom-cpp
- aapis-cpp
- mscpp
- quad-sim-cpp
- ceres-factors
- signals-cpp
- secure-delete
- sorting
- crowcpp
- rankserver-cpp
- mfn
- orchestrator-cpp
manif-geom-cpp
Templated, header-only implementations for SO(2), SE(2), SO(3), SE(3).
Operationally very similar to variations on Eigen's Quaternion<T> class, but with added chart maps and rules for addition and subtraction on tangent spaces. Meant to be used with nonlinear least-squares solvers like the Ceres Solver, which take advantage of templating to implement auto-differentiation on arbitrary mathematical formulations in code.
The SO(3) math is based on my notes on 3D rotation representations.
Including in Your Project With CMake
# ...
find_package(Eigen3 REQUIRED)
find_package(manif-geom-cpp REQUIRED)
include_directories(
${EIGEN3_INCLUDE_DIRS}
)
# ...
target_link_libraries(target INTERFACE manif-geom-cpp)
Example Usage
Example usage of SO(3) (independent snippets excerpted from the unit tests):
// action
SO3d q = SO3d::random();
Vector3d v;
v.setRandom();
Vector3d qv1 = q * v;
Vector3d qv2 = q.R() * v;
BOOST_CHECK_CLOSE(qv1.x(), qv2.x(), 1e-8);
BOOST_CHECK_CLOSE(qv1.y(), qv2.y(), 1e-8);
BOOST_CHECK_CLOSE(qv1.z(), qv2.z(), 1e-8);
// inversion and composition
SO3d q1 = SO3d::random();
SO3d q2 = SO3d::random();
SO3d q2inv = q2.inverse();
SO3d q1p = q1 * q2 * q2inv;
BOOST_CHECK_CLOSE(q1.w(), q1p.w(), 1e-8);
BOOST_CHECK_CLOSE(q1.x(), q1p.x(), 1e-8);
BOOST_CHECK_CLOSE(q1.y(), q1p.y(), 1e-8);
BOOST_CHECK_CLOSE(q1.z(), q1p.z(), 1e-8);
// Euler conversions
Vector3d euler;
euler.setRandom();
euler *= M_PI;
SO3d q = SO3d::fromEuler(euler.x(), euler.y(), euler.z());
SO3d q2 = SO3d::fromEuler(q.roll(), q.pitch(), q.yaw());
BOOST_CHECK_CLOSE(q.w(), q2.w(), 1e-8);
BOOST_CHECK_CLOSE(q.x(), q2.x(), 1e-8);
BOOST_CHECK_CLOSE(q.y(), q2.y(), 1e-8);
BOOST_CHECK_CLOSE(q.z(), q2.z(), 1e-8);
// plus / minus
SO3d q1 = SO3d::random();
Vector3d q12;
q12.setRandom();
SO3d q2 = q1 + q12;
Vector3d q12p = q2 - q1;
BOOST_CHECK_CLOSE(q12.x(), q12p.x(), 1e-8);
BOOST_CHECK_CLOSE(q12.y(), q12p.y(), 1e-8);
BOOST_CHECK_CLOSE(q12.z(), q12p.z(), 1e-8);
// chart maps
SO3d q = SO3d::random();
Vector3d w;
w.setRandom();
Vector3d qLog = SO3d::Log(q);
SO3d q2 = SO3d::Exp(qLog);
BOOST_CHECK_CLOSE(q.w(), q2.w(), 1e-8);
BOOST_CHECK_CLOSE(q.x(), q2.x(), 1e-8);
BOOST_CHECK_CLOSE(q.y(), q2.y(), 1e-8);
BOOST_CHECK_CLOSE(q.z(), q2.z(), 1e-8);
SO3d wExp = SO3d::Exp(w);
Vector3d w2 = SO3d::Log(wExp);
BOOST_CHECK_CLOSE(w.x(), w2.x(), 1e-8);
BOOST_CHECK_CLOSE(w.y(), w2.y(), 1e-8);
BOOST_CHECK_CLOSE(w.z(), w2.z(), 1e-8);
// scaling
SO3d qI = SO3d::identity();
SO3d qIs = 5.0 * qI;
BOOST_CHECK_CLOSE(qIs.w(), qI.w(), 1e-8);
BOOST_CHECK_CLOSE(qIs.x(), qI.x(), 1e-8);
BOOST_CHECK_CLOSE(qIs.y(), qI.y(), 1e-8);
BOOST_CHECK_CLOSE(qIs.z(), qI.z(), 1e-8);
SO3d qr = SO3d::random();
SO3d qr2 = qr * 0.2; // if scale is too big, then the rotation will
// wrap around the sphere, resulting in a reversed
// or truncated tangent vector which can't be inverted
// through scalar division
SO3d qr3 = qr2 / 0.2;
BOOST_CHECK_CLOSE(qr.w(), qr3.w(), 1e-8);
BOOST_CHECK_CLOSE(qr.x(), qr3.x(), 1e-8);
BOOST_CHECK_CLOSE(qr.y(), qr3.y(), 1e-8);
BOOST_CHECK_CLOSE(qr.z(), qr3.z(), 1e-8);
Conventions
Ordering
Scalar term first:
$$\mathbf{R} \in SO(2) \triangleq \begin{bmatrix} q_w & q_x \end{bmatrix}.$$
$$\mathbf{R} \in SO(3) \triangleq \begin{bmatrix} q_w & q_x & q_y & q_z \end{bmatrix}.$$
Handedness
Right-handed:
$$\mathbf{q}_1 \otimes \mathbf{q}_2=[\mathbf{q}_1]_L\mathbf{q}_2=[\mathbf{q}_2]_R\mathbf{q}_1,$$
$$[\mathbf{q}]_L \triangleq \begin{bmatrix}q_w & -q_x & -q_y & -q_z \\ q_x & q_w & -q_z & q_y \\ q_y & q_z & q_w & -q_x \\ q_z & -q_y & q_x & q_w\end{bmatrix},$$
$$[\mathbf{q}]_R \triangleq \begin{bmatrix}q_w & -q_x & -q_y & -q_z \\ q_x & q_w & q_z & -q_y \\ q_y & -q_z & q_w & q_x \\ q_z & q_y & -q_x & q_w \end{bmatrix}.$$
Function
Passive:
$$\mathbf{R}_A^B\,{}^A\mathbf{v}={}^B\mathbf{v}.$$
Directionality and Perturbation
Body-to-world with local perturbations:
$$\mathbf{R}_B^W \oplus \tilde{\theta} \triangleq \mathbf{R}_B^W \text{Exp}\left(\tilde{\theta}\right).$$
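For completeness, the corresponding local ⊖ operator implied by the plus/minus examples above would be (inferred notation, consistent with the convention here rather than quoted from the library):
$$\mathbf{R}_2 \ominus \mathbf{R}_1 \triangleq \text{Log}\left(\left(\mathbf{R}_1\right)^{-1}\mathbf{R}_2\right).$$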
aapis-cpp
C++ bindings for my custom APIs.
mscpp
Useful template classes for creating multithreaded, interdependent microservices in C++.
Use cases pending.
quad-sim-cpp
C++ library and daemon for simulating quadrotor dynamics from PWM motor inputs.
Under construction.
ceres-factors
C++ library with custom parameterizations and cost functions for the Ceres Solver.
Examples documented in the unit tests.
Articles/tutorials showcasing some of the custom cost functions and parameterizations:
signals-cpp
Header-only templated C++ library implementing rigid-body dynamics, derivatives, integrals, and interpolation.
Examples documented in the unit tests.
secure-delete
Secure file deletion utility, written in C.
The deletion process is as follows:
- Overwrite the file with multiple passes. After each pass, the disk cache is flushed. The number of passes depends on the commanded mode:
  - (default / secure mode) 38 passes:
    - 1x overwrite with 0xff.
    - 5x random passes.
    - 27x overwrites with special values to make recovery from MFM- and RLL-encoded hard disks hard to impossible.
    - 5x random passes.
  - (insecure mode) 2 passes:
    - 1x overwrite with 0xff.
    - 1x random pass.
  - (totally insecure mode) 1 pass:
    - 1x random pass.
- Truncate the file, so that an observer wouldn't know which disk blocks belonged to the file.
- Rename the file.
- Delete (unlink) the file.
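To make the sequence concrete, here is a minimal Python sketch of the overwrite/flush/truncate/rename/unlink steps (an illustration only; the actual tool is written in C and uses the full pass patterns described above):

import os

def overwrite_and_delete(path: str, passes: int = 2) -> None:
    """Illustrative only: overwrite a file, flush the cache, truncate, rename, unlink."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for i in range(passes):
            f.seek(0)
            # first pass writes 0xff, remaining passes write random data (simplified)
            pattern = b"\xff" * size if i == 0 else os.urandom(size)
            f.write(pattern)
            f.flush()
            os.fsync(f.fileno())  # flush the disk cache after each pass
    # truncate so an observer can't tell which disk blocks belonged to the file
    with open(path, "r+b") as f:
        f.truncate(0)
    # rename to an anonymous name, then unlink
    anon = os.path.join(os.path.dirname(path) or ".", "x" * len(os.path.basename(path)))
    os.rename(path, anon)
    os.unlink(anon)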
In one second, approximately 1 to 2 MB of data can be overwritten (on a hard disk). In "totally insecure" mode, approximately 100 MB can be overwritten in 15 seconds; the same deletion takes about 60 minutes in the default secure mode.
Usage
secure-delete [-dflrvz] file1 file2 etc.
Options:
-d ignore the two dot special files "." and "..".
-f fast (and insecure mode): no /dev/urandom, no synchronize mode.
-l lessens the security (use twice for total insecure mode).
-r recursive mode, deletes all subdirectories.
-v is verbose mode.
-z last wipe writes zeros instead of random data.
Does a secure overwrite/rename/delete of the target file(s).
Default is secure mode (38 writes).
sorting
A C++ library for sporadic, incremental sorting with client-side comparators.
The main idea of this library is to take sorting algorithms like Quicksort and make them stateless across iterations. The sorting is performed within the "server" one step at a time where all the state information needed to perform the next step in the sort is passed in as an input from a "client." The client must keep track of this state and also perform the binary comparisons requested by the server at each step.
This non-traditional conception of sorting effectively allows a human to be placed in the middle of the sorting loop, dictating the atomic binary comparisons between elements in a sortable set. Since the outcomes of these comparisons dictate the final ordering of the elements from the sorting algorithm, this design provides a natural (and thorough) way for a person to topologically rank arbitrary sets of objects through the cognitively manageable task of successive binary choices of preference. The rankserver experiment is powered by this library.
crowcpp
A minimally-patched fork of Crow, a C++ webserver.
The patch allows one to dynamically specify where the website's assets directory is, which is a necessary feature for rankserver-cpp.
rankserver-cpp
A portable webserver for RESTfully ranking files via binary manual comparisons.
Spins up a crowcpp webserver (on the specified port) whose purpose is to help a user rank files in the chosen data-dir directory via manual binary comparisons. The ranking is done via an incremental "RESTful" sorting strategy implemented within the sorting library. State is created and maintained within the data-dir directory so that the ranking exercise can pick back up where it left off between different spawnings of the server. At this point, only the ranking of .txt and .png files is possible; other file types in data-dir will be ignored.
Usage (Auto-Generated)
Options:
-h [ --help ] print usage
-p [ --port ] arg port to serve on (default: 4000)
-d [ --data-dir ] arg data directory to process (default: ./data)
mfn
Simple CLI tool meant to analyze an image of a single person and print whether the person appears to be a male (m), female (f), or neither (n).
Uses vanilla OpenCV tools. Depending on the model, it can be pretty trigger-happy classifying genders even on inanimate objects, so for best results only use images of one person. Neural network model description and weights not included.
Usage
usage: mfn [Options] imgfile
Options:
--model-proto arg gender model description file
--model-weights arg gender model weights file
--imgfile arg image file to process
orchestrator-cpp
C++ implementation of a multi-threaded job manager for my OS.
Under construction.
Rust Packages
Packages written in Rust.
manif-geom-rs
Rust implementation of manif-geom-cpp (under construction).
TODO: Once finished, these docs will contrast the API with manif-geom-cpp.
xv-lidar-rs
Daemon for the Neato XV LiDAR (not quite finished).
Written in Rust. Repository
Currently the program will simply continuously print out 2D point cloud data to the console. I plan to instead have it stream gRPC 2D point cloud messages (defined in aapis) to a mscpp-based daemon for real-time pose estimation over SE(2).
Usage (Auto-Generated)
XV LiDAR Interface Daemon
Usage: xv-lidar-rs [OPTIONS]
Options:
-d, --device <DEVICE> Device name [default: /dev/ttyACM0]
-h, --help Print help information
-V, --version Print version information
sunnyside
File scrambler.
Written in Rust. Repository
Usage (Auto-Generated)
Make some scrambled eggs
Usage: sunnyside --target <TARGET> --shift <SHIFT> --key <KEY>
Options:
-t, --target <TARGET> File target
-s, --shift <SHIFT> Shift amount
-k, --key <KEY> Scramble key
-h, --help Print help
-V, --version Print version
Python Packages
Packages written (or bound) in Python.
- aapis-py
- fqt
- find_rotational_conventions
- geometry
- pyceres
- pyceres_factors
- pysorting
- makepyshell
- scrape
- pysignals
- mesh-plotter
- orchestrator
- gmail-parser
- trafficsim
- flask-hello-world
- flask-url2mp4
- flask-mp4server
- flask-mp3server
- flask-smfserver
- flask-oatbox
- rankserver
- stampserver
- easy-google-auth
- rcdo
- task-tools
- photos-tools
- wiki-tools
- book-notes-sync
- goromail
aapis-py
Python bindings for my custom APIs.
fqt
Four-quadrant tasking.
This little CLI tool will suggest classes of activities to do based on configured priorities and preferences.
Example config file:
Framework Learning:25
Programming Projects:25
The Arts:10
Fun:10
Family:30
Usage (Auto-Generated)
Usage: fqt [OPTIONS] COMMAND [ARGS]...
Four-quadrants tasking tools.
Options:
--config-file PATH Path to the config file. [default: ~/fqt/config]
--log-file PATH Path to the log file. [default: ~/fqt/log]
--help Show this message and exit.
Commands:
analyze Analyze past task performance.
task Propose a task for the day.
find_rotational_conventions
Find rotational conventions of a Python transform library.
Conventions are defined in my notes on rotations. Example deduction of conventions used in the geometry library:
from find_rotational_conventions import (
    find_euler_conventions,
    find_axis_angle_conventions,
    find_quaternion_conventions,
)
import numpy as np
from typing import Tuple
from geometry import SO3  # https://github.com/goromal/geometry

LIBNAME = "manif-geom-cpp/geometry"  # Library being tested

def euler2R(arg1: float, arg2: float, arg3: float) -> np.ndarray:
    return SO3.fromEuler(arg1, arg2, arg3).R()

def axisAngle2R(axis: np.ndarray, angle: float) -> np.ndarray:
    return SO3.fromAxisAngle(axis, angle).R()

def quat2R(q1: float, q2: float, q3: float, q4: float) -> np.ndarray:
    return SO3.fromQuat(q1, q2, q3, q4).R()

def quatComp(
    q1: Tuple[float, float, float, float],
    q2: Tuple[float, float, float, float]
) -> Tuple[float, float, float, float]:
    q = SO3.fromQuat(*q1) * SO3.fromQuat(*q2)
    return (q.w(), q.x(), q.y(), q.z())

find_axis_angle_conventions(LIBNAME, axisAngle2R)
find_euler_conventions(LIBNAME, euler2R)
find_quaternion_conventions(LIBNAME, quat2R, quatComp)
Yields the output:
Axis-Angle Conventions for manif-geom-cpp/geometry:
Rodrigues Directionality: Body-to-World
Euler Angle Conventions for manif-geom-cpp/geometry:
Euler Argument Order: ['x', 'y', 'z']
Euler Matrix Order: R = R(z)R(y)R(x)
Euler Directionality: Body-to-World
Quaternion Conventions for manif-geom-cpp/geometry:
Quaternion Ordering: Scalar First
Quaternion Handedness: Right-Handed
Quaternion Function: Passive
Quaternion Directionality: Body-to-World
Usage (Auto-Generated)
NOTE: You're running find_rotational_conventions.py standalone, but it's most useful as an import to deduce the rotational conventions of the particular library that you're using.
Running an example from the following source code:
=========
test.py |=================================================================
========= |
from find_rotational_conventions import ( |
find_euler_conventions, |
find_axis_angle_conventions, |
find_quaternion_conventions, |
) |
import numpy as np |
from typing import Tuple |
from geometry import SO3 # https://github.com/goromal/geometry |
|
LIBNAME = "manif-geom-cpp/geometry" # Library being tested |
|
def euler2R(arg1: float, arg2: float, arg3: float) -> np.ndarray: |
return SO3.fromEuler(arg1, arg2, arg3).R() |
|
def axisAngle2R(axis: np.ndarray, angle: float) -> np.ndarray: |
return SO3.fromAxisAngle(axis, angle).R() |
|
def quat2R(q1: float, q2: float, q3: float, q4: float) -> np.ndarray: |
return SO3.fromQuat(q1, q2, q3, q4).R() |
|
def quatComp( |
q1: Tuple[float, float, float, float], |
q2: Tuple[float, float, float, float] |
) -> Tuple[float, float, float, float]: |
q = SO3.fromQuat(*q1) * SO3.fromQuat(*q2) |
return (q.w(), q.x(), q.y(), q.z()) |
|
find_axis_angle_conventions(LIBNAME, axisAngle2R) |
find_euler_conventions(LIBNAME, euler2R) |
find_quaternion_conventions(LIBNAME, quat2R, quatComp) |
|
==========================================================================
With output:
Axis-Angle Conventions for manif-geom-cpp/geometry:
Rodrigues Directionality: Body-to-World
Euler Angle Conventions for manif-geom-cpp/geometry:
Euler Argument Order: ['x', 'y', 'z']
Euler Matrix Order: R = R(z)R(y)R(x)
Euler Directionality: Body-to-World
Quaternion Conventions for manif-geom-cpp/geometry:
Quaternion Ordering: Scalar First
Quaternion Handedness: Right-Handed
Quaternion Function: Passive
Quaternion Directionality: Body-to-World
geometry
Implementations for SO(3) and SE(3).
Python-wrapped version of the C++ manif-geom-cpp library.
Example Usage
Example usage of SO3:
# action
q = SO3.random()
v = np.random.random(3)
qv1 = q * v
qv2 = q.R().dot(v)
assert np.allclose(qv1, qv2)
# inversion and composition
qI = SO3.identity()
q1 = SO3.random()
q1i = q1.inverse()
q1I = q1 * q1i
assert np.allclose(qI.array(), q1I.array())
# Euler conversions
roll = -1.2
pitch = 0.6
yaw = -0.4
q = SO3.fromEuler(roll, pitch, yaw)
rpy = q.toEuler()
assert np.isclose(roll, rpy[0]) and np.isclose(pitch, rpy[1]) and np.isclose(yaw, rpy[2])
# plus / minus
R1 = SO3.random()
w = np.array([0.5, 0.2, 0.1])
R2 = R1 + w
w2 = R2 - R1
assert np.allclose(w, w2)
# chart maps
q = SO3.random()
w = np.random.random(3)
qlog = SO3.Log(q)
q2 = SO3.Exp(qlog)
assert np.allclose(q.array(), q2.array())
wexp = SO3.Exp(w)
w2 = SO3.Log(wexp)
assert np.allclose(w, w2)
# scaling
qI = SO3.identity()
qIs = 5.0 * qI
assert np.allclose(qI.array(), qIs.array())
qr = SO3.random()
qr2 = qr * 0.2
qr3 = qr2 / 0.2
assert np.allclose(qr.array(), qr3.array())
pyceres
Python bindings for the Ceres Solver.
Tutorial on how to use the library in conjunction with pyceres_factors and geometry.
pyceres_factors
Python bindings of ceres-factors.
Tutorial on how to use the library in conjunction with pyceres and geometry.
pysorting
RESTful incremental sorting with client-side comparators.
This library is a Python-wrapped version of the C++ sorting library. As such, it is meant to be used in conjunction with a client that can solicit answers to binary comparisons for the purpose of incremental sorting.
Example usage in a Python script:
# QuickSortState, restfulQuickSort, ComparatorResult, ComparatorLeft, and UINT32_MAX
# are provided by this library's bindings

# key-value pairs to be sorted by values
values = {0: 4.8, 1: 10.0, 2: 1.0, 3: 2.5, 4: 5.0}

state = QuickSortState()
state.n = 5
state.arr = [i for i in values.keys()]
state.stack = [0 for i in range(state.n)]
# validateState(state)

# proxy for user choices from some client; this will simply choose the larger
# value, resulting in an ascending sort
def updateComparator(a, b):
    if a < b:
        return int(ComparatorResult.LEFT_LESS)
    elif a > b:
        return int(ComparatorResult.LEFT_GREATER)
    else:
        return int(ComparatorResult.LEFT_EQUAL)

# simulate user choices until the list is sorted
iter = 0
maxIters = 50
while not (state.top == UINT32_MAX and state.c != 0) and iter < maxIters:
    iter_success, state_out = restfulQuickSort(state)
    state = state_out
    if state.l == int(ComparatorLeft.I):
        state.c = updateComparator(values[state.arr[state.i]], values[state.arr[state.p]])
    elif state.l == int(ComparatorLeft.J):
        state.c = updateComparator(values[state.arr[state.j]], values[state.arr[state.p]])
    iter += 1

# sorted keys
# state.arr == [2, 3, 0, 4, 1]
makepyshell
Generate a nix-shell file for Python development.
Usage (Auto-Generated)
usage: makepyshell [-h] [--nix-path NIX_PATH]
[--modules MODULES [MODULES ...]]
Generate a nix-shell file (shell.nix) for Python 3.9 development.
options:
-h, --help show this help message and exit
--nix-path NIX_PATH Nix source path. (default: nixpkgs)
--modules MODULES [MODULES ...]
Python modules for development. (default: [])
scrape
Scrape content off the internet, quickly.
This is a simple tool that assumes you want to download files from a straightforwardly-constructed HTML page. You'll need an XPath specification to help narrow down the scraping.
Resource files for testing scrape:
sample_960x400_ocean_with_audio
Usage (Auto-Generated)
usage: scrape [-h] [--xpath XPATH] [--ext EXT] [-o DIRNAME]
{simple-link-scraper,simple-image-scraper} page
Scrape content off the internet, quickly.
positional arguments:
{simple-link-scraper,simple-image-scraper}
The type of content to be scraped.
page Webpage url.
options:
-h, --help show this help message and exit
--xpath XPATH Optionally specify the XPath
--ext EXT Optionally specify the file extension
-o DIRNAME, --output DIRNAME
Output directory.
pysignals
Python bindings of signals-cpp.
mesh-plotter
Tools for plotting transforms and line meshes in Python.
Example usage:
import numpy as np
from geometry import SO3, SE3
from mesh_plotter.meshes import Axes3DMesh
from mesh_plotter.animator_3d import Animator3D
times = [0.0, 5.0, 10.0, 15.0]
se3_1 = SE3.identity()
se3_2 = SE3.fromVecAndQuat(np.array([2.0,0.0,0.0]),
SO3.fromEuler(1.5,-2.0,0.2))
se3_3 = SE3.fromVecAndQuat(np.array([-1.5,-1.5,1.5]),
SO3.random())
transforms1 = [se3_1]*4
transforms2 = [se3_2]*4
transforms3 = [se3_3]*4
transforms = [se3_1, se3_2, se3_3, se3_1]
animator = Animator3D(xlim=(-2.5,2.5), ylim=(-2.5,2.5), zlim=(0.,2.))
animator.addMeshSequence(Axes3DMesh(scale=0.4), transforms1, times)
animator.addMeshSequence(Axes3DMesh(scale=0.4), transforms2, times)
animator.addMeshSequence(Axes3DMesh(scale=0.4), transforms3, times)
animator.addMeshSequence(Axes3DMesh(), transforms, times)
animator.animate(dt = 0.1)
orchestrator
Daemon + CLI for managing select background tasks on my computer.
Work in progress. Detailed description to come.
gmail-parser
Assorted Python tools for semi-automated processing of GMail messages.
This package may be used either in CLI form or via an interactive Python shell.
Interactive Shell
Import with
from gmail_parser.corpus import GMailCorpus
Deleting promotions and social network emails:
inbox = GMailCorpus('your_email@gmail.com').Inbox(1000)
inbox.clean()
inbox = GMailCorpus('your_email@gmail.com').Inbox(1000)
Get all senders of unread emails:
unread = inbox.fromUnread()
print(unread.getSenders())
Read all unread emails from specific senders:
msgs = unread.fromSenders(['his@email.com', 'her@email.com']).getMessages()
for msg in msgs:
print(msg.getText())
Mark an entire sub-inbox as read:
subInbox.markAllAsRead()
Usage (Auto-Generated)
Usage: gmail-manager [OPTIONS] COMMAND [ARGS]...
Manage GMail.
Options:
--gmail-secrets-json PATH GMail client secrets file. [default:
~/secrets/google/client_secrets.json]
--gmail-refresh-file PATH GMail refresh file (if it exists). [default:
~/secrets/google/refresh.json]
--gbot-refresh-file PATH GBot refresh file (if it exists). [default:
~/secrets/google/bot_refresh.json]
--journal-refresh-file PATH Journal refresh file (if it exists). [default:
~/secrets/google/journal_refresh.json]
--enable-logging BOOLEAN Whether to enable logging. [default: False]
--help Show this message and exit.
Commands:
clean Clean out promotions and social emails.
gbot-send Send an email from GBot.
journal-send Send an email from Journal.
send Send an email.
Usage: gmail-manager clean [OPTIONS]
Clean out promotions and social emails.
Options:
--num-messages INTEGER Number of messages to poll before cleaning.
[default: 1000]
--help Show this message and exit.
Usage: gmail-manager send [OPTIONS] RECIPIENT SUBJECT BODY
Send an email.
Options:
--help Show this message and exit.
Usage: gmail-manager gbot-send [OPTIONS] RECIPIENT SUBJECT BODY
Send an email from GBot.
Options:
--help Show this message and exit.
Usage: gmail-manager journal-send [OPTIONS] RECIPIENT SUBJECT BODY
Send an email from Journal.
Options:
--help Show this message and exit.
trafficsim
Simulate traffic.
Simple traffic simulator on a circular road. Cars have two control objectives: maintain a consistent distance between cars and maintain a consistent car speed.
Usage (Auto-Generated)
usage: trafficsim [-h] [--num_cars NUM_CARS] [--vel_des VEL_DES]
[--vel_max VEL_MAX] [--beta_mu BETA_MU]
[--beta_sigma BETA_SIGMA] [--gamma_mu GAMMA_MU]
[--gamma_sigma GAMMA_SIGMA]
[--vel_col_thresh VEL_COL_THRESH]
[--pos_col_thresh POS_COL_THRESH]
Simple traffic simulator on a circular road.
options:
-h, --help show this help message and exit
--num_cars NUM_CARS Number of cars to simulate. (default: 7)
--vel_des VEL_DES Desired car velocity. (default: 1.0)
--vel_max VEL_MAX Maximum allowed car velocity. (default: 1.5)
--beta_mu BETA_MU Mean proportional gain. (default: 0.5)
--beta_sigma BETA_SIGMA
Proportional gain standard deviation. (default: 0.5)
--gamma_mu GAMMA_MU Mean derivative gain. (default: 0.5)
--gamma_sigma GAMMA_SIGMA
Derivative gain standard deviation. (default: 0.5)
--vel_col_thresh VEL_COL_THRESH
Distance threshold for slowing down. (default: 0.3)
--pos_col_thresh POS_COL_THRESH
Distance threshold for collision avoidance (default:
0.15)
flask-hello-world
Spawn a trivial website, powered by Python's flask library.
Usage (Auto-Generated)
usage: flask_hello_world [-h] [--port PORT]
options:
-h, --help show this help message and exit
--port PORT Port to run the server on
flask-url2mp4
Convert URLs pointing to videos to MP4s, powered by Python's flask library.
The server page takes a URL string and uses either wget or youtube-dl to download the video and convert it to MP4 using the mp4 tool.
Usage (Auto-Generated)
usage: flask_url2mp4 [-h] [--port PORT]
options:
-h, --help show this help message and exit
--port PORT Port to run the server on
flask-mp4server
Spawn an MP4 conversion server, powered by Python's flask library.
The server page takes an input video file and converts it to an MP4 using the mp4 tool.
Usage (Auto-Generated)
usage: flask_mp4server [-h] [--port PORT]
options:
-h, --help show this help message and exit
--port PORT Port to run the server on
flask-mp3server
Spawn an MP3 conversion server, powered by Python's flask library.
The server page takes an input audio file and converts it to an MP3 using the mp3 tool. One can also specify a frequency transpose in terms of positive or negative half-steps.
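For reference, a transpose of n half-steps scales the audio's frequencies by the standard equal-temperament factor
$$f_{\text{out}} = f_{\text{in}} \cdot 2^{\,n/12},$$
so +12 half-steps doubles the frequency (one octave up) and -12 halves it.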
Usage (Auto-Generated)
usage: flask_mp3server [-h] [--port PORT]
options:
-h, --help show this help message and exit
--port PORT Port to run the server on
flask-smfserver
Spawn an SMF "simple music file" conversion server, powered by Python's flask library.
The server page presents a text input area where you can type a song as specified by the simplified SMF music specification language:
- Notes are typed as letters with spaces between them.
- All notes are assumed to be the 4th octave unless a number is given after the letter.
- All notes are assumed to be quarter notes unless a number (multiplier) is given before the letter (or group of letters):
- /3
- /2
- 2
- 4
- Accidentals are typed immediately after a letter:
- ^ = sharp
- _ = flat
- The dash (-) means a rest. Use it like a letter.
- Multiple letters in a group form a chord.
- Begin a line with a number and colon (e.g., 1:) to specify a unique voice.
Under the hood, conversions to MP3 are done using the abc and mp3 tools.
Usage (Auto-Generated)
usage: flask_smfserver [-h] [--port PORT]
options:
-h, --help show this help message and exit
--port PORT Port to run the server on
flask-oatbox
"One at a time" (O.A.T.) Box. Store one file at a time, powered by Python's flask library.
This tool gives you a method to store, extract, and replace files (again, one at a time) in the directory from which the tool is run.
Usage (Auto-Generated)
usage: flask_oatbox [-h] [--port PORT]
options:
-h, --help show this help message and exit
--port PORT Port to run the server on
rankserver
A portable webserver for ranking files via binary manual comparisons, powered by Python's flask library.
Spins up a flask webserver (on the specified port) whose purpose is to help a user rank files in the chosen data-dir directory via manual binary comparisons. The ranking is done via an incremental "RESTful" sorting strategy implemented within the pysorting library. State is created and maintained within the data-dir directory so that the ranking exercise can pick back up where it left off between different spawnings of the server. At this point, only the ranking of .txt and .png files is possible; other file types in data-dir will be ignored.
Usage (Auto-Generated)
usage: rankserver [-h] [--port PORT] [--data-dir DATA_DIR]
options:
-h, --help show this help message and exit
--port PORT Port to run the server on
--data-dir DATA_DIR Directory containing the rankable elements
stampserver
Provides an interface for stamping metadata on PNGs and MP4s.
Usage (Auto-Generated)
usage: stampserver [-h] [--port PORT] [--data-dir DATA_DIR]
options:
-h, --help show this help message and exit
--port PORT Port to run the server on
--data-dir DATA_DIR Directory containing the stampable elements
easy-google-auth
Convenience library for abstracting away Google API authorization protocols for my various Python libraries that use it.
Used as the authorization source for:
rcdo
Run commands on remote machines.
Usage (Auto-Generated)
Usage: rcdo [OPTIONS] REMOTE_HOST CMD COMMAND [ARGS]...
Run a local command `cmd` on a remote machine.
`remote_host` can be a single or multi-step hop, e.g.,
user@hostname:password
user1@hostname1:password+user2@hostname2:password+...
Options:
-i, --input TEXT Remote file(s) to grab.
-o, --output TEXT Local file(s) to create.
--ssh-config TEXT Path to SSH config file. [default: ~/.ssh/config]
-v, --verbose Print out diagnostic information.
--help Show this message and exit.
Commands:
local The command is from your local machine.
remote The command is from the remote machine.
task-tools
CLI tools for managing Google Tasks.
Usage (Auto-Generated)
Usage: task-tools [OPTIONS] COMMAND [ARGS]...
Manage Google Tasks.
Options:
--task-secrets-file PATH Google Tasks client secrets file. [default:
~/secrets/google/client_secrets.json]
--task-refresh-token PATH Google Tasks refresh file (if it exists).
[default: ~/secrets/google/refresh.json]
--task-list-id TEXT UUID of the Task List to query. [default:
MDY2MzkyMzI4NTQ1MTA0NDUwODY6MDow]
--enable-logging BOOLEAN Whether to enable logging. [default: False]
--help Show this message and exit.
Commands:
clean Delete / clean up failed timed tasks.
delete Delete a particular task by UUID.
grader Generate a CSV report of how consistently tasks have been...
list List pending tasks according to a filter ∈ [all, p0, p1, p2,...
put Upload a task.
put-spec Read a CSV of task specifications and idempotently put them...
Usage: task-tools list [OPTIONS] FILTER
List pending tasks according to a filter ∈ [all, p0, p1, p2, p3, late,
ranked].
Options:
--date [%Y-%m-%d] Maximum due date for filtering tasks. [default:
2024-11-20]
--no-ids Don't show the UUIDs.
--help Show this message and exit.
Usage: task-tools delete [OPTIONS] TASK_ID
Delete a particular task by UUID.
Options:
--help Show this message and exit.
Usage: task-tools put [OPTIONS]
Upload a task.
Options:
--name TEXT Name of the task. [required]
--notes TEXT Notes to add to the task description.
--date [%Y-%m-%d] Task due date. [default: 2024-11-20]
--help Show this message and exit.
Usage: task-tools grader [OPTIONS]
Generate a CSV report of how consistently tasks have been completed within
the specified window.
Grading criteria:
- P0: ... tasks must be completed same day.
- P1: ... tasks must be completed within a week.
- P2: ... tasks must be completed within a month.
- P3: ... tasks must be completed within 90 days.
Deletion / failure criteria:
- P[0-3]: [T] ... tasks that have not be completed within the appropriate
window.
P0 manually generated tasks will be migrated to the current day.
Options:
--start-date [%Y-%m-%d] First day of the grading window. [default:
2024-11-13]
--end-date [%Y-%m-%d] Last day of the grading window. [default:
2024-11-20]
-o, --out PATH CSV file to generate the report in. [default:
~/data/task_grades/log.csv]
--dry-run Do a dry run; no task deletions.
--help Show this message and exit.
photos-tools
CLI tools for managing Google Photos.
For your photos management, follow these steps:
- Favorite only the media that you would like to "thin out"
- On a computer with space, run the clean method
- Move the whole Favorites directory to the trash
Usage (Auto-Generated)
Usage: photos-tools [OPTIONS] COMMAND [ARGS]...
Manage Google Photos.
Options:
--photos-secrets-file PATH Google Photos client secrets file. [default:
~/secrets/google/client_secrets.json]
--photos-refresh-token PATH Google Photos refresh file (if it exists).
[default: ~/secrets/google/refresh.json]
--enable-logging BOOLEAN Whether to enable logging. [default: False]
--help Show this message and exit.
Commands:
clean Download favorited photos so that you can later delete them from...
Usage: photos-tools clean [OPTIONS]
Download favorited photos so that you can later delete them from the cloud.
Options:
--output-dir PATH Directory to download the media to.
--dry-run Dry run only.
--help Show this message and exit.
wiki-tools
CLI tools for managing my wiki notes site.
Usage (Auto-Generated)
Usage: wiki-tools [OPTIONS] COMMAND [ARGS]...
Read and edit DokuWiki instance pages.
Options:
--url TEXT URL of the DokuWiki instance (https). [default:
https://notes.andrewtorgesen.com]
--secrets-file PATH Path to the DokuWiki login secrets JSON file.
[default: ~/secrets/wiki/secrets.json]
--enable-logging BOOLEAN Whether to enable logging. [default: False]
--help Show this message and exit.
Commands:
get Read the content of a DokuWiki page.
get-md Read the content of a DokuWiki page in Markdown format.
get-rand-journal Get a random journal entry between 2013 and now.
put Put content onto a DokuWiki page.
put-dir Put a directory of pages into a DokuWiki namespace.
put-md Put Markdown content onto a DokuWiki page.
put-md-dir Put a directory of Markdown pages into a DokuWiki...
Usage: wiki-tools get [OPTIONS]
Read the content of a DokuWiki page.
Options:
--page-id TEXT ID of the DokuWiki page. [required]
--output TEXT Output text file name. Will print to terminal if not
specified.
--help Show this message and exit.
Usage: wiki-tools get-md [OPTIONS]
Read the content of a DokuWiki page in Markdown format.
Options:
--page-id TEXT ID of the DokuWiki page. [required]
--output TEXT Output Markdown file name. Will print to terminal if not
specified.
--help Show this message and exit.
Usage: wiki-tools get-rand-journal [OPTIONS]
Get a random journal entry between 2013 and now.
Options:
--namespace TEXT Journal pages namespace. [default: journals]
--output TEXT Output text file name. Will print to terminal if not
specified.
--help Show this message and exit.
Usage: wiki-tools put [OPTIONS]
Put content onto a DokuWiki page.
Options:
--page-id TEXT ID of the DokuWiki page. [required]
--file PATH File containing the target content.
--content TEXT Content to put on the page if file is not specified. NOTE:
This argument is mutually exclusive with content_file
--help Show this message and exit.
Usage: wiki-tools put-dir [OPTIONS]
Put a directory of pages into a DokuWiki namespace.
Options:
--pages-dir DIRECTORY Directory with .txt pages to upload. [required]
--namespace TEXT Namespace to upload the pages to. [required]
--help Show this message and exit.
Usage: wiki-tools put-md [OPTIONS]
Put Markdown content onto a DokuWiki page.
Options:
--page-id TEXT ID of the DokuWiki page. [required]
--file PATH Markdown file containing the target content.
--content TEXT Markdown content to put on the page if file is not
specified. NOTE: This argument is mutually exclusive with
content_file
--help Show this message and exit.
Usage: wiki-tools put-md-dir [OPTIONS]
Put a directory of Markdown pages into a DokuWiki namespace.
Options:
--pages-dir DIRECTORY Directory with .txt pages to upload. [required]
--namespace TEXT Namespace to upload the pages to. [required]
--help Show this message and exit.
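A sketch of a typical round trip, with a hypothetical page ID and file name:
$ wiki-tools get-md --page-id projects:anixpkgs --output notes.md  # pull a page as Markdown
$ wiki-tools put-md --page-id projects:anixpkgs --file notes.md    # push the edited file back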
book-notes-sync
Utility for syncing Google Play Books notes with my personal wiki.
Usage (Auto-Generated)
Usage: book-notes-sync [OPTIONS] COMMAND [ARGS]...
Synchronize Google Docs book notes with corresponding DokuWiki notes.
Options:
--docs-secrets-file PATH Google Docs client secrets file. [default:
~/secrets/google/client_secrets.json]
--docs-refresh-token PATH Google Docs refresh file (if it exists).
[default: ~/secrets/google/refresh.json]
--wiki-url TEXT URL of the DokuWiki instance (https). [default:
https://notes.andrewtorgesen.com]
--wiki-secrets-file TEXT Path to the DokuWiki login secrets JSON file.
[default: ~/secrets/wiki/secrets.json]
--enable-logging BOOLEAN Whether to enable logging. [default: True]
--help Show this message and exit.
Commands:
sync Sync a single Google Doc with a single DokuWiki page.
sync-from-csv Sync a list of Google Docs with DokuWiki pages from a CSV.
Usage: book-notes-sync sync [OPTIONS]
Sync a single Google Doc with a single DokuWiki page.
Options:
--docs-id TEXT Document ID of the Google Doc. [required]
--page-id TEXT ID of the DokuWiki page. [required]
--help Show this message and exit.
Usage: book-notes-sync sync-from-csv [OPTIONS]
Sync a list of Google Docs with DokuWiki pages from a CSV.
Options:
--sync-csv PATH CSV specifying (docs-id, page-id) pairs. [default:
~/configs/book-notes.csv]
--help Show this message and exit.
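As a sketch (document IDs and page IDs below are placeholders), a ~/configs/book-notes.csv of (docs-id, page-id) pairs might contain:
1AbC...XyZ,books:dune
1DeF...UvW,books:walden
and be synced with:
$ book-notes-sync sync-from-csv --sync-csv ~/configs/book-notes.csv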
goromail
Manage mail for GBot and Journal.
The following workflows are supported, all via text messaging:
GBot (goromal.bot@gmail.com):
- Calorie counts via a solo number (e.g., 100)
- Tasks via the keywords P[0-3]:
  - P0 = "Must do today"
  - P1 = "Must do within a week"
  - P2 = "Must do within a month"
  - P3 = "Should do eventually"
- Keyword matchers for routing to specific Wiki pages, which are configurable via a CSV file passed to the bot command:
  - KEYWORD: [P0-1:] ...
  - Sort KEYWORD. [P0-1:] ...
- ITNS additions via any other pattern
Journal (goromal.journal@gmail.com):
- Any pattern will be added to the journal according to the date on which the message was sent, unless prepended by the string "mm/dd/yyyy:".
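For instance, hypothetical messages and how they would be routed:
100                         -> logged by GBot as a calorie count
P1: Renew car registration  -> GBot task that must be done within a week
11/02/2024: Went hiking.    -> Journal entry filed under November 2, 2024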
Usage (Auto-Generated)
Usage: goromail [OPTIONS] COMMAND [ARGS]...
Manage the mail for GBot and Journal.
Options:
--gmail-secrets-json PATH GMail client secrets file. [default:
~/secrets/google/client_secrets.json]
--gbot-refresh-file PATH GBot refresh file (if it exists). [default:
~/secrets/google/bot_refresh.json]
--journal-refresh-file PATH Journal refresh file (if it exists). [default:
~/secrets/google/journal_refresh.json]
--num-messages INTEGER Number of messages to poll for GBot and Journal
(each). [default: 1000]
--wiki-url TEXT URL of the DokuWiki instance (https).
[default: https://notes.andrewtorgesen.com]
--wiki-secrets-file PATH Path to the DokuWiki login secrets JSON file.
[default: ~/secrets/wiki/secrets.json]
--task-secrets-file PATH Google Tasks client secrets file. [default:
~/secrets/google/client_secrets.json]
--notion-secrets-file PATH Notion client secrets file. [default:
~/secrets/notion/secret.json]
--task-refresh-token PATH Google Tasks refresh file (if it exists).
[default: ~/secrets/google/refresh.json]
--enable-logging BOOLEAN Whether to enable logging. [default: False]
--headless Whether to run in headless (i.e., server) mode.
--headless-logdir PATH Directory in which to store log files for
headless mode. [default: ~/goromail]
--help Show this message and exit.
Commands:
annotate-triage-pages Re-title triage pages based on content.
bot Process all pending bot commands.
journal Process all pending journal entries.
Usage: goromail bot [OPTIONS]
Process all pending bot commands.
Options:
--categories-csv PATH CSV that maps keywords to notion pages. [default:
~/configs/goromail-categories.csv]
--dry-run Do a dry run; no message deletions.
--help Show this message and exit.
Usage: goromail journal [OPTIONS]
Process all pending journal entries.
Options:
--dry-run Do a dry run; no message deletions.
--help Show this message and exit.
Bash Packages
Packages written (or glued together) in Bash.
- aapis-grpcurl
- aptest
- authm
- abc
- getres
- mp3
- mp4
- png
- svg
- color-prints
- dirgroups
- dirgather
- manage-gmail
- gantter
- la-quiz
- budget_report
- md2pdf
- notabilify
- fix-perms
- make-title
- pb
- code2pdf
- cpp-helper
- py-helper
- rust-helper
- mp4unite
- git-cc
- git-shortcuts
- setupws
- listsources
- pkgshell
- devshell
- providence
- providence-tasker
- fixfname
- nix-deps
- nix-diffs
- anix-version
- anix-upgrade
- anix-changelog-compare
- flake-update
- rcrsync
aapis-grpcurl
Interact with gRPC servers using custom APIs.
A wrapped version of the grpcurl tool that points to my custom API definitions.
Since I use gRPC for inter-process communication in most of my simulated robot platform personal projects, this is a useful CLI tool for debugging.
aptest
Run a SITL instance of ardupilot from source.
Sample Commands (for heli)
- If you're running LUA scripts and have some in an ardupilot/scripts directory:
param set SCR_ENABLE 1
reboot
param set DISARM_DELAY 0
mode guided
arm throttle
takeoff 25
Usage (Auto-Generated)
usage: aptest [options] path_to_ardupilot
Run a SITL instance of ardupilot from source.
Options:
-f|--frame Copter frame to simulate [default: heli]
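A hedged example invocation, assuming an ardupilot checkout at ~/sources/ardupilot:
$ aptest -f heli ~/sources/ardupilot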
authm
Manage secrets.
Usage (Auto-Generated)
Usage: authm [OPTIONS] COMMAND [ARGS]...
Manage secrets.
Options:
--help Show this message and exit.
Commands:
refresh Refresh all auth tokens one-by-one.
validate Validate the secrets files present on the filesystem.
Usage: authm refresh [OPTIONS]
Refresh all auth tokens one-by-one.
Options:
--headless Run in headless mode.
--force Force the auth files to be re-written. If headless, run a
headless refresh.
--help Show this message and exit.
Usage: authm validate [OPTIONS]
Validate the secrets files present on the filesystem.
Options:
--help Show this message and exit.
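A typical maintenance pass might look like the following (purely illustrative):
$ authm validate            # check the secrets files present on the filesystem
$ authm refresh --headless  # refresh all auth tokens without a browser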
abc
Generate abc music files from similar formats.
Usage (Auto-Generated)
usage: abc inputfile outputfile
Create an abc file.
Inputs:
.smf
.midi
getres
Get the screen resolution of this computer.
usage: getres [opts]
Get the screen resolution of this computer.
Options:
-v|--verbose Print diagnostic information
--no-fail Fall back to a reasonable default resolution if this
computer's resolution can't be deduced
mp3
Generate (or modify) an MP3 file from similar formats.
Usage (Auto-Generated)
usage: mp3 inputfile outputfile
Create a mp3 file.
Inputs:
.mp3
.mp4
.wav
.abc
Options:
--transpose [+- # HALF STEPS]
Powered by https://github.com/breakfastquay/rubberband.
--TODO
mp4
Generate and edit MP4 video files using ffmpeg.
Usage (Auto-Generated)
usage: mp4 inputfile outputfile
Create a mp4 file.
Inputs:
.mp4
.gif
.mpeg
.mkv
.mov
.avi
.webm
Options:
-v | --verbose Print verbose output from ffmpeg
-m | --mute Remove audio
-q | --quality CHAR - for low, = for medium, + for high bit rate quality
-w | --width WIDTH Constrain the video width (pixels)
-l | --label "STR" Add label to bottom left corner of video
-f | --fontsize INT Font size for added text
-c | --crop INT:INT:INT:INT Crop video (pre-labeling) W:H:X:Y
-s | --start TIME INITIAL time: [HH:]MM:SS[.0]
-e | --end TIME FINAL time: [HH:]MM:SS[.0]
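For example, muting, downscaling, and trimming a clip (file names and values are placeholders):
$ mp4 input.mov output.mp4 -m -w 1280 -s 00:10 -e 01:30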
png
Generate PNG images from a variety of similar formats.
Usage (Auto-Generated)
usage: png inputfile outputfile
Create a png file.
Inputs:
.png
.gif
.svg
.jpeg
.heic
.tiff
Options:
-r|--resize [e.g., 50%] Resize the image.
-s|--scrub Scrub image metadata.
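e.g., converting a phone photo while halving its size and scrubbing metadata (file names are placeholders):
$ png IMG_0001.heic IMG_0001.png -r 50% -s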
svg
Generate and edit SVG files from a variety of source formats.
Usage (Auto-Generated)
usage: svg inputfile outputfile
Create an svg file.
Inputs:
.svg
.abc
.pdf
Options:
--crop | svg [x] abc [ ] pdf [x]
--rmtext | svg [x] abc [ ] pdf [x]
--poppler | svg [x] abc [ ] pdf [x]
--scour | svg [x] abc [x] pdf [x]
--rmwhite | svg [x] abc [x] pdf [x]
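For instance, per the option matrix above, a PDF input supports cropping and text removal (file names are placeholders; option placement is assumed):
$ svg figure.pdf figure.svg --crop --rmtext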
color-prints
Color-formatted wrapped echo commands.
ANSI color codes referenced from Wikipedia.
echo_black
echo_red
echo_green
echo_yellow
echo_blue
echo_magenta
echo_cyan
echo_white
dirgroups
Split directories into smaller ones.
usage: dirgroups num_groups dir
OR
dirgroups --of group_size dir
Split a large directory of files into smaller directories with evenly distributed files (not counting remainders).
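Both calling conventions, with a placeholder directory:
$ dirgroups 4 ~/photos/export         # split into 4 evenly sized directories
$ dirgroups --of 500 ~/photos/export  # split into directories of ~500 files each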
dirgather
Gather all files in a directory tree into a single directory.
usage: dirgather [options] rootdir gatherdir
Recursively take all files in rootdir's tree and gather them in a new gatherdir. Subsequently, clean up all empty directories in rootdir.
Options:
--dry-run Perform a dry run
manage-gmail
Interactively manage your GMail inbox from the command line.
Powered by gmail-parser.
Usage (Auto-Generated)
usage: manage-gmail
Enter an interactive shell for managing a GMail inbox.
Examples:
[Deleting promotions and social network emails]
>> baseInbox = GMailCorpus('your_email@gmail.com').Inbox(1000)
>> baseInbox.clean()
>> baseInbox = GMailCorpus('your_email@gmail.com').Inbox(1000)
[Get all senders of unread emails]
>> unreadInbox = baseInbox.fromUnread()
>> print(unreadInbox.getSenders())
[Read all unread emails from specific senders]
>> msgs = unreadInbox.fromSenders(['his@email.com', 'her@email.com']).getMessages()
>> for msg in msgs:
>> print(msg.getText())
[Mark an entire sub-inbox as read]
>> subInbox.markAllAsRead()
gantter
Generate Gantt charts from text files.
usage: gantter specfile
Create a Gantt-based dependency chart for tasks, laid out by the specfile.
Example specfile contents:
--------------------------------------------------------------------------
1>> Coverage Planner
1.1>> Learn interface for outer loop
1.2>> [[1.1]] ((2)) Connect Lab 4 code with outer loop
1.3>> [[2.1]] [[3.1]] ((3)) Waiting for SLAM
2>> SLAM Algorithm
2.1>> Something
3>> System-Level Evaluation
3.1>> Another thing
--------------------------------------------------------------------------
Double brackets [[]] indicate dependencies and double parentheses (())
indicate estimated time units required (assumes 1 if none given).
REQUIRES pdflatex to be in your system path (not interested in shipping
texlive-full in its entirety with this little tool).
la-quiz
Spawn a LA geography quiz.
usage: la-quiz [options] [N|C|E|S]
Spawn a LA geography quiz! Will pull up the general region you specify:
N = North
C = Central
E = East
S = South
Options:
--debug|-d Open in debug mode (will print click positions to the screen).
NOTE: This program assumes that you have the place location JSON files stored in
~/games/la-quiz/GLAA-C.json
GLAA-E.json
GLAA-N.json
GLAA-S.json
budget_report
Generate a budget report.
Usage (Auto-Generated)
Usage: budget_report [OPTIONS] COMMAND [ARGS]...
Tools for Budget Management.
Options:
--secrets-json PATH Client secrets file. [default:
~/secrets/google/client_secrets.json]
--refresh-file PATH Refresh file (if it exists). [default:
~/secrets/google/refresh.json]
--config-json PATH Budget tool config file. [default:
~/configs/budget-tool.json]
--enable-logging BOOLEAN Whether to enable logging. [default: False]
--help Show this message and exit.
Commands:
transactions-bin Bin all transactions from a category sheet.
transactions-process Process raw transactions.
transactions-status Get the status of raw transactions.
transactions-upload Upload missing raw transactions to the budget sheet.
Usage: budget_report transactions-process [OPTIONS]
Process raw transactions.
Options:
--dry-run Activate dry run mode.
--help Show this message and exit.
Usage: budget_report transactions-status [OPTIONS]
Get the status of raw transactions.
Options:
--help Show this message and exit.
Usage: budget_report transactions-upload [OPTIONS]
Upload missing raw transactions to the budget sheet.
Options:
--raw-csv PATH Raw CSV file with transactions. [required]
--account TEXT Account type from the config file. [required]
--dry-run Activate dry run mode.
--help Show this message and exit.
Usage: budget_report transactions-bin [OPTIONS]
Bin all transactions from a category sheet.
Options:
--category TEXT Category from the Config sheet. [required]
--help Show this message and exit.
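For illustration, uploading a hypothetical bank export (the account name must be one defined in ~/configs/budget-tool.json; these values are placeholders):
$ budget_report transactions-upload --raw-csv ~/Downloads/checking.csv --account checking --dry-run
$ budget_report transactions-status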
md2pdf
Convert Markdown files into formatted PDF files, powered by LaTeX.
Usage (Auto-Generated)
usage: md2pdf input.md output.pdf
Use LaTeX to convert a markdown file into a formatted pdf.
notabilify
Make any PDF document suitable for note taking in e.g., Notability.
Usage (Auto-Generated)
usage: notabilify input.pdf output.pdf
Takes a portrait PDF file and adds a large blank space to the right of every page for taking notes.
fix-perms
Recursively claim ownership of all files and folders in dir.
usage: fix-perms dir
Attempts to deduce special cases such as ~/.ssh/*.
make-title
Print decorated titles.
usage: make-title [options] title
Prints out a decorated title.
Options:
-h | --help Print out the help documentation.
-c | --color One of [black|red|green|yellow|blue|magenta|cyan|white].
Arguments:
title word or phrase making up the title
Example:
$ make-title "Hello, World"
========================
===== Hello, World =====
========================
pb
Print out a progress bar.
usage: pb [options] iternum itertot
Prints a progress bar.
Options:
-h | --help Print out the help documentation.
-b | --barsize Dictate the total progress bar length in chars (Default: 20).
-c | --color One of [black|red|green|yellow|blue|magenta|cyan|white].
Arguments:
iternum: current iteration number
itertot: number of total iterations
Example usage:
N=0
T=20
while [ \$N -le \$T ]; do
pb \$N \$T
N=\$[\$N+1]
sleep 1
done
echo
code2pdf
Generate pretty-printed PDF files from source code files.
Usage (Auto-Generated)
usage: code2pdf infile output.pdf
Convert plain text code infile to color-coded pdf outfile.
Recursive search example for C++ files:
for f in $(find . -name '*.cpp' -or -name '*.h'); do code2pdf $f $f.pdf; done
cpp-helper
Convenience tools for setting up C++ projects.
Usage (Auto-Generated)
usage: cpp-helper [options]
Options:
exec-lib CPPNAME Generate a lib+exec package template
header-lib CPPNAME Generate a header-only library template
format-file Dumps a format rules file into .clang-format
nix Dump template shell.nix file
vscode Generate VSCode C++ header detection settings file
make TARGET|all Full CMake build command (run from repo root)
challenge TARGET|all Full CMake build command WITH SANITIZERS
(run from repo root)
py-helper
Developer tools for creating Python packages.
usage: py-helper [options]
Options:
--make-pkg NAME Generate a template python package
--make-pybind-lib NAME,CPPNAME Generate a pybind package wrapping a header-only library
--make-nix Dump template default.nix and shell.nix files
rust-helper
Convenience tools for setting up Rust projects.
Usage (Auto-Generated)
usage: rust-helper [options]
Options:
dev Drop directly into a Rust development shell
nix Dump template shell.nix file
vscode DEFAULT[:OTHER:ENV] Generate VSCode settings file for rust-analyzer
(Run inside a Nix dev environment)
mp4unite
Unite mp4 files, much like with the pdfunite tool.
usage: mp4unite [options] <MP4-sourcefile-1>..<MP4-sourcefile-n> <MP4-destfile>
Combine MP4 source files into a single destination MP4 file.
Options:
-h | --help Print out the help documentation.
-v | --verbose Print verbose output from ffmpeg
git-cc
Create a carbon copy of a Git repo, but with Git removed.
usage: git-cc repo_dir des_dir
Recursively backup a git repository (and its submodules) to a new, git-less source tree.
Effectively wraps up an arbitrarily complex git repo into a flat-packaged mass of code.
git-shortcuts
Git shortcut commands.
Usage (Auto-Generated)
usage: gitcm commit message ...
"Git CoMmit, push, and get head revision." No quotes needed for commit message.
usage: gitcop [-f] [branch_name]
"Git CheckOut and Pull." Assumes remote is named origin. Optional -f flag fetches first.
If no branch_name is provided, then only a pull will occur.
usage: githead [repo_path]
Gets the commit hash of the HEAD of the current (or specified) repo path.
setupws
Create standalone development workspaces.
Unlike devshell's setupcurrentws command, this tool takes all of its setup info from the CLI:
usage: setupws [OPTIONS] workspace_name srcname:git_url [srcname:git_url ...] [scriptname=scriptpath ...]
Create a development workspace with specified git sources and scripts.
Options:
--dev_dir [DIRNAME] Specify the root directory where the [workspace_name] source
directory will be created (default: ~/dev)
--data_dir [DIRNAME] Specify the root directory where the [workspace_name] mutable
data will be stored (default: ~/data)
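A sketch of an invocation, with a hypothetical workspace name, source repository, and script path:
$ setupws --dev_dir ~/dev --data_dir ~/data sandbox scrape:git@github.com:goromal/scrape.git run=scripts/run.sh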
listsources
Get the Git info about all sources in a devshell workspace.
This command needs to be run within a devshell workspace created with setupws.
usage: listsources
List git information for all sources in a workspace. Must be run
within a workspace created by setupws.
pkgshell
Flexible Nix shell.
usage: pkgshell [options] pkgs attr [--run CMD]
Make a nix shell with package [attr] from [pkgs] (e.g., '<nixpkgs>').
Optionally run a one-off command with --run CMD.
Special values for [pkgs]:
anixpkgs Fetch the latest anixpkgs from GitHub
Options:
-v|--verbose Print verbose output.
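As illustrations (package attributes are arbitrary):
$ pkgshell '<nixpkgs>' jq --run 'jq --version'  # one-off command in a throwaway shell
$ pkgshell anixpkgs mp4                         # interactive shell with the latest anixpkgs mp4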
devshell
Developer tool for creating siloed dev environments.
A workspace has the directory tree structure:
- [dev_dir]/[workspace_name] : Workspace root.
  - data/ : Directory for storing long-lived workspace data, symlinked to [data_dir]/[workspace_name].
  - .envrc : direnv environment file defining important workspace aliases.
  - shell.nix : Workspace shell file for lorri integrations.
  - sources/ : Directory containing all workspace source repositories.
The dev/ directory can be deleted and re-constructed as needed, whereas the data/ directory holds stuff that's meant to last.
Once in the shell, the following commands are provided:
- setupcurrentws : A wrapped version of setupws that will build your development workspace as specified in ~/.devrc.
- godev : An alias that will take you to the root of your development workspace.
- listsources : See the listsources tool documentation.
- dev : Enter an interactive menu for workspace source manipulation.
Usage (Auto-Generated)
usage: devshell [-n|--new] [-d DEVRC] [-s DEVHIST] [--override-data-dir DIR] [--run CMD] workspace_name
Enter [workspace_name]'s development shell as defined in ~/.devrc
(can specify an alternate path with -d DEVRC or history file with
-s DEVHIST).
Add a new workspace with the -n|--new flag.
Optionally run a one-off command with --run CMD (e.g., --run dev).
Example ~/.devrc:
=================================================================
dev_dir = ~/dev
data_dir = ~/data
pkgs_dir = ~/sources/anixpkgs
pkgs_var = <anixpkgs>
# repositories
[manif-geom-cpp] = pkgs manif-geom-cpp
[geometry] = pkgs python3.pkgs.geometry
[pyvitools] = git@github.com:goromal/pyvitools.git
[scrape] = git@github.com:goromal/scrape.git
# scripts
<script_ref> = data_dir_relative_path/script
# workspaces
signals = manif-geom-cpp geometry pyvitools script_ref
=================================================================
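Given a ~/.devrc like the one above, a hedged example session (workspace name taken from the sample file):
$ devshell --new signals       # create and enter the signals workspace defined in ~/.devrc
$ devshell --run dev signals   # later: run the interactive source-manipulation menu as a one-off command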
providence
Be randomly dictated to from passages of importance.
Requires a wiki secrets file at ~/secrets/wiki/secrets.json.
Usage (Auto-Generated)
usage: providence [options] domain
Pick randomly from a specified domain:
- patriarchal
- passage
Options:
--wiki-url URL URL of wiki to get data from (default: https://notes.andrewtorgesen.com)
providence-tasker
Providence + Google Tasks integration.
Takes output from providence and places it into [num_days] consecutive days of Google Tasks.
Requires a wiki secrets file at ~/secrets/wiki/secrets.json and a Google Tasks secrets file at ~/secrets/task/secrets.json.
Usage (Auto-Generated)
usage: providence-tasker [options] num_days
Generate [num_days] tasks derived from providence output.
Options:
--wiki-url URL URL of wiki to get data from (default: https://notes.andrewtorgesen.com)
fixfname
Unix-ify filenames.
usage: fixfname FILE
Replace spaces and remove [], () characters from a filename (in place).
nix-deps
Recurse the dependencies of a Nix package.
Usage (Auto-Generated)
usage: nix-deps derivation
OR e.g.,
nix-deps '<nixpkgs>' -A pkgname
Recurse the dependencies of a Nix package.
nix-diffs
Diff the Nix hashes of two closures or packages in nicely-formatted diffed text files (with ordering mostly preserved).
Usage (Auto-Generated)
usage: nix-diffs derivation1 derivation2 out_dir
Diff the Nix hashes of two closures or packages in nicely-formatted diffed text files (with ordering mostly preserved).
anix-version
Get the current anixpkgs version of the operating system.
Usage (Auto-Generated)
usage: anix-version
Get the current anixpkgs version of the operating system.
anix-upgrade
Upgrade the operating system and view the delta. [NIXOS VERSION]
Usage (Auto-Generated)
usage: anix-upgrade [-v|--version VERSION;-c|--commit COMMIT;-b|--branch BRANCH;-s|--source SOURCETREE] [--local] [--boot]
Upgrade the operating system and view the delta. [NIXOS VERSION]
For this to work properly, your configuration.nix file or home.nix file must be symlinked to some configuration file in ~/sources/anixpkgs (which is generated by this program).
anix-changelog-compare
Compare the changelogs of two instances of anixpkgs.
Usage (Auto-Generated)
usage: anix-changelog-compare anixpkgs_FROM anixpkgs_TO
Compare the changelogs of two instances of anixpkgs.
flake-update
Automatically update the flake lock of every changed ref (according to Git diff).
Usage (Auto-Generated)
usage: flake-update [path/to/flake.nix]
Automatically update the flake lock of every changed ref (according to Git diff).
rcrsync
Cloud directory management tool.
Usage (Auto-Generated)
usage: rcrsync [OPTS] [init|sync|copy|override] CLOUD_DIR
Manage cloud directories with rclone.
Options:
-v|--verbose Print verbose output
CLOUD_DIR options:
Java Packages
(Toy) Packages written in Java.
evil-hangman
Interactive hangman game where you'll probably lose (because the computer is cheating).
Written in Java. Repository
usage: evil-hangman <word-length> <num-guesses>
spelling-corrector
Offer up spelling corrections for provided misspelled words.
Written in Java. Repository
The repository contains the text-file dictionary from which all word suggestions derive.
usage: spelling-corrector <word>
simple-image-editor
Perform some simple image transformations from the command line.
Written in Java. Repository
Input and output images must be in the PPM image format.
usage: simple-image-editor <input.ppm> <output.ppm> [grayscale|invert|emboss|motionblur motion-blur-length]