Compare commits: v2025.0629 ... v2025.0720

51 commits:
- 6a3ca6bc10
- 7f8312ed59
- 1b03087c02
- 0ba6227412
- f5ba2e719b
- 73c94f34f6
- af4cbbcab0
- a415eb0f91
- 83d6cf1603
- fbaa3a4089
- 0c767e065c
- f7d2001871
- d13011a329
- d27904ec05
- decf16da7f
- aa04f5e71e
- 17224c4637
- 4badce0ed4
- fe83fc3d64
- 7149b8714e
- af95d27964
- 3eb78acf70
- 3d21d1da7d
- 344d62034c
- 78e41214d7
- 512ba200c2
- 5f04bd23a1
- 67bb7f747f
- e55fe1a17c
- 2f056b8500
- fe3c5d2ad9
- 2ab38fd053
- 9dda4e1649
- d8883c4419
- 4c4257eebe
- 4bb85c63b8
- e5f3569b2a
- de200a5bb6
- 0f1cfdcc28
- 7f937c1090
- d7964d3a78
- 719475e29f
- 70cb5c1b3a
- facc6b73b0
- 9a24576e37
- 3f68f44e3d
- dbe88a7121
- 00d1e86157
- 3388a46bf3
- 0f5421630a
- 50fb5f9da6
@@ -26,7 +26,10 @@ jobs:
          password: ${{ secrets.DOCKER_PUSH_TOKEN }}
      - name: Build Test Publish All
        run: |
-         SOS_WRITE_TOKEN=${{ secrets.SOS_WRITE_TOKEN }} RELEASE_WRITE_TOKEN=${{ secrets.RELEASE_WRITE_TOKEN }} ./buildtestpublish_all.sh
+         SOS_WRITE_TOKEN=${{ secrets.SOS_WRITE_TOKEN }} \
+         RELEASE_WRITE_TOKEN=${{ secrets.RELEASE_WRITE_TOKEN }} \
+         GITEA_CONTAINER_NAME=${{ env.JOB_CONTAINER_NAME }} \
+         ./buildtestpublish_all.sh

  test-install-from-scratch:
    needs: [build]
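The workflow above hands the secrets to the build script as environment variables. A minimal sketch of how a script such as `buildtestpublish_all.sh` might validate them (the `require_token` helper is an assumption for illustration, not code from the repository):

```bash
#!/bin/bash
set -euo pipefail

# Fail fast with a clear message when a required token is missing
# (hypothetical helper; the repo's scripts may validate differently).
require_token() {
  if [ -z "$2" ]; then
    echo "error: $1 must be set" >&2
    return 1
  fi
}

# In CI these come from the workflow's secrets; default them here so
# the sketch runs standalone.
SOS_WRITE_TOKEN="${SOS_WRITE_TOKEN:-demo-token}"
RELEASE_WRITE_TOKEN="${RELEASE_WRITE_TOKEN:-demo-token}"

require_token SOS_WRITE_TOKEN "$SOS_WRITE_TOKEN"
require_token RELEASE_WRITE_TOKEN "$RELEASE_WRITE_TOKEN"
echo "tokens present"
```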
.kiro/steering/product.md (new file, 23 lines)

@@ -0,0 +1,23 @@
# Product Overview

This repository contains **getpkg** - a command-line package manager for the dropshell ecosystem, along with a collection of developer tools.

## Core Product
- **getpkg**: Package manager that installs tools to `~/.getpkg/` with symlinks in `~/.local/bin/getpkg/`
- Supports multiple architectures (x86_64, aarch64, universal)
- Tools are published to and downloaded from `getpkg.xyz`

## Tool Collection
The repository includes several utility tools:
- **bb64**: Bash-compatible base64 encoder/decoder with custom character set
- **dehydrate**: Converts files/directories to C++ source code for embedding
- **whatsdirty**: Git repository status checker
- **sos**: Simple object storage client
- **gp**: Git push utility
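bb64's custom character set is not shown in this diff; as a rough illustration of the encode/decode round-trip idea it implements, using coreutils `base64` as a stand-in:

```bash
#!/bin/bash
set -euo pipefail

# Round-trip: encode a string, then decode it back (standard base64
# alphabet here, not bb64's custom character set).
original="hello dropshell"
encoded=$(printf '%s' "$original" | base64)
decoded=$(printf '%s' "$encoded" | base64 -d)

[ "$decoded" = "$original" ] && echo "round-trip ok"
```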
## Key Features
- Cross-platform tool distribution
- Automated installation with PATH setup
- Bash completion support
- Architecture-aware downloads with fallbacks
- Publishing system with authentication tokens
.kiro/steering/structure.md (new file, 72 lines)

@@ -0,0 +1,72 @@
# Project Structure

## Repository Layout
```
├── buildtestpublish_all.sh    # Master build script for all projects
├── clean.sh                   # Global cleanup script
├── README.md                  # Main project documentation
└── <tool-name>/               # Individual tool directories
```

## Tool Directory Structure

### C++ Projects (CMake-based)
```
<tool-name>/
├── CMakeLists.txt             # CMake configuration
├── build.sh                   # Build script
├── test.sh                    # Test script
├── clean.sh                   # Cleanup script
├── publish.sh                 # Publishing script
├── install.sh                 # Installation script
├── README.md                  # Tool documentation
├── Dockerfile.dropshell-build # Docker build configuration
├── src/                       # Source code
│   ├── <tool>.cpp             # Main source file
│   ├── version.hpp.in         # Version template
│   └── ...                    # Additional sources
├── build/                     # Build artifacts (generated)
├── output/                    # Final executables (generated)
└── .vscode/                   # VS Code configuration
```

### Shell Script Projects
```
<tool-name>/
├── <tool-name>                # Executable shell script
├── build.sh                   # Build script (may be no-op)
├── test.sh                    # Test script
├── clean.sh                   # Cleanup script
├── publish.sh                 # Publishing script
└── setup_script.sh            # Post-install setup (optional)
```

## Standard Files

### Required Scripts
- **build.sh**: Builds the project (Docker for C++, no-op for shell)
- **test.sh**: Runs project tests
- **clean.sh**: Removes build artifacts
- **publish.sh**: Publishes to getpkg.xyz registry
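Since every tool follows the same per-directory script layout, a new tool can be scaffolded mechanically. A hypothetical scaffold (the script names come from the list above; the stub contents are an assumption):

```bash
#!/bin/bash
set -euo pipefail

# Create a tool directory containing the four required scripts,
# each a minimal executable stub.
tool_dir="$(mktemp -d)/newtool"
mkdir -p "$tool_dir"

for script in build test clean publish; do
  printf '#!/bin/bash\nset -euo pipefail\n' > "$tool_dir/$script.sh"
  chmod +x "$tool_dir/$script.sh"
done

ls "$tool_dir"
```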
### Optional Files
- **install.sh**: System-wide installation script
- **setup_script.sh**: Post-install setup for getpkg
- **cmake_prebuild.sh**: Pre-build setup for CMake projects

### Generated Directories
- **build/**: CMake build artifacts (C++ projects)
- **output/**: Final executables ready for distribution
- **test_*/**: Test-specific directories

## Naming Conventions
- Tool directories match executable names
- C++ source files typically match project name
- Version templates use `.hpp.in` extension
- Docker files use `Dockerfile.dropshell-build` pattern
- Test directories prefixed with `test_`

## Configuration Files
- **.gitignore**: Standard ignore patterns for build artifacts
- **.vscode/**: VS Code workspace settings
- **CMakeLists.txt**: Follows standard template with PROJECT_NAME parameter
.kiro/steering/tech.md (new file, 70 lines)

@@ -0,0 +1,70 @@
# Technology Stack

## Build System
- **CMake 3.16+** with Ninja generator for C++ projects
- **Docker** containerized builds using `gitea.jde.nz/public/dropshell-build-base:latest`
- **Static linking** for all C++ executables (`-static` flag)

## Languages & Standards
- **C++23** standard for all C++ projects
- **Bash** for shell scripts and simple tools
- **Shell scripts** follow `set -euo pipefail` pattern

## Dependencies
- **nlohmann_json** for JSON handling in C++ projects
- **CPR (static)** for HTTP requests in getpkg
- Custom modules in `/usr/local/share/cmake/Modules`

## Common Build Patterns

### C++ Projects (CMake)
```bash
# Standard build command
cmake -G Ninja -S . -B ./build -DCMAKE_BUILD_TYPE=Debug -DPROJECT_NAME=<project>
cmake --build ./build
```

### Docker Build (for C++ tools)
```bash
# Uses Dockerfile.dropshell-build pattern
docker build -t <project>-build -f Dockerfile.dropshell-build --build-arg PROJECT=<project> --output ./output .
```

### Shell Tools
- No build step required
- Executable shell scripts with proper shebang
- Use `chmod +x` for permissions

## Common Commands

### Build
```bash
./build.sh                  # Build individual project
./buildtestpublish_all.sh   # Build all projects
```

### Test
```bash
./test.sh                   # Run tests for individual project
```

### Clean
```bash
./clean.sh                  # Clean build artifacts
```

### Publish
```bash
./publish.sh                # Publish to getpkg.xyz (requires SOS_WRITE_TOKEN)
```

## Version Management
- Automatic timestamp-based versioning: `YYYY.MMDD.HHMM`
- Version configured via `version.hpp.in` template files
- Pre-build scripts (`cmake_prebuild.sh`) for additional setup
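A `YYYY.MMDD.HHMM` timestamp version can be produced with a single `date` invocation; the exact command the build scripts use is not shown in this diff, so this is a sketch:

```bash
#!/bin/bash
set -euo pipefail

# Timestamp-based version in YYYY.MMDD.HHMM form, e.g. 2025.0720.1430
VERSION=$(date -u +%Y.%m%d.%H%M)
echo "$VERSION"
```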
## Environment Variables
- `CMAKE_BUILD_TYPE`: Debug/Release (default: Debug)
- `SOS_WRITE_TOKEN`: Authentication for publishing
- `NO_CACHE`: Skip Docker cache when set to "true"
- `PROJECT`: Project name for build scripts
README.md (modified: 192 lines removed, 86 added)

@@ -1,192 +1,86 @@
The old README (removed):

# getpkg - Package Manager for Dropshell Tools

getpkg is a command-line package manager that simplifies tool installation, management, and publishing for the dropshell ecosystem. Tools are installed to `~/.getpkg/` with executable symlinks in `~/.local/bin/getpkg/` and automatically added to your PATH with bash completion.

## Installation

Install getpkg with a single command:

```bash
curl https://getbin.xyz/getpkg-install | bash
```

After installation, restart your shell or run `source ~/.bashrc` to enable the new PATH and completion settings.

## Basic Usage

### Installing Tools

Install any tool from the getpkg registry:

```bash
# Install a tool
getpkg install whatsdirty
```

### Managing Installed Tools

```bash
# List all available commands
getpkg help

# Update all installed tools
getpkg update

# Uninstall a tool
getpkg uninstall whatsdirty

# Check getpkg version
getpkg version
```

## Available Commands

### Core Package Management

- **`getpkg install <tool_name>`** - Install or update a tool
- **`getpkg uninstall <tool_name>`** - Remove an installed tool
- **`getpkg update`** - Update getpkg and all installed tools

### Publishing (Requires SOS_WRITE_TOKEN)

- **`getpkg publish <tool_name[:ARCH]> <folder>`** - Upload a tool to getpkg.xyz
- **`getpkg unpublish <tool_name[:ARCH]>`** - Remove a published tool
- **`getpkg unpublish <hash>`** - Remove a published tool by hash

### Development Tools

- **`getpkg create <tool_name> <directory>`** - Create a new tool project
- **`getpkg hash <file_or_directory>`** - Calculate hash of files/directories

### Information

- **`getpkg list`** - List all available packages with status
- **`getpkg clean`** - Clean up orphaned configs and symlinks
- **`getpkg version`** - Show getpkg version
- **`getpkg help`** - Show detailed help
- **`getpkg autocomplete`** - Show available commands for completion

## How It Works

### Installation Process

When you install a tool, getpkg:

1. **Downloads** the tool archive from getpkg.xyz
2. **Extracts** it to `~/.getpkg/<tool_name>/`
3. **Creates symlinks** for all executables in `~/.local/bin/getpkg/`
4. **Ensures PATH** includes `~/.local/bin/getpkg` (one-time setup)
5. **Enables completion** for the tool
6. **Runs setup** if a `setup_script.sh` exists
7. **Stores metadata** in `~/.config/getpkg/<tool_name>.json`

### Architecture Support

getpkg supports multiple architectures:
- `x86_64` (Intel/AMD 64-bit)
- `aarch64` (ARM 64-bit)
- `universal` (cross-platform tools)

Tools are automatically downloaded for your architecture, with fallback to universal versions.

### File Locations

- **Tool files**: `~/.getpkg/<tool_name>/` (actual tool installation)
- **Executable symlinks**: `~/.local/bin/getpkg/` (in your PATH)
- **Configuration**: `~/.config/getpkg/`
- **PATH setup**: `~/.bashrc_getpkg` (sourced by `~/.bashrc`)

## Examples

### Installing Popular Tools

```bash
# Install available tools
getpkg install dehydrate    # File to C++ code generator
getpkg install bb64         # Bash base64 encoder/decoder

# Development tools (for repository development)
getpkg install whatsdirty   # Check git repo status
getpkg install sos          # Simple object storage client
getpkg install gp           # Git push utility
```

### Publishing Your Own Tools

```bash
# Set your publishing token
export SOS_WRITE_TOKEN="your-token-here"

# Create a new tool project
getpkg create mytool ./mytool-project

# Publish architecture-specific build
getpkg publish mytool:x86_64 ./build/

# Publish universal tool
getpkg publish mytool ./build/

# Remove published tool
getpkg unpublish mytool:x86_64
```

### Development Workflow

```bash
# Create tool structure
getpkg create awesome-tool ./awesome-tool
cd awesome-tool

# Build your tool...
# Add executable to the directory

# Test locally
./awesome-tool --version

# Publish when ready
getpkg publish awesome-tool:x86_64 .
```

## Environment Variables

- **`SOS_WRITE_TOKEN`** - Authentication token for publishing tools

## Troubleshooting

### Tool Not Found
If a tool isn't found after installation, ensure your shell has loaded the new PATH:
```bash
source ~/.bashrc
```

### Permission Issues
getpkg installs to your home directory and doesn't require root access. If you encounter permission issues, check that `~/.local/bin/` is writable.

### Network Issues
All tools are downloaded from `getpkg.xyz`. Ensure you have internet connectivity and the domain is accessible.

## Development

### Building getpkg

```bash
# Build debug version
cd getpkg && ./build.sh

# Run tests
cd getpkg && ./test.sh

# Publish (requires SOS_WRITE_TOKEN)
cd getpkg && ./publish.sh
```

### Tool Development

When creating tools for getpkg:

1. Create a directory with your tool binary
2. Optionally include a `setup_script.sh` for post-install setup
3. The tool should support `version` and `autocomplete` subcommands
4. Use `getpkg publish` to upload to the registry

For more details, see the development documentation in each tool's directory.
The new README (added):

# getpkg - Simple Package Manager

getpkg is a command-line package manager that makes it easy to install and manage developer tools. Tools are automatically installed to your home directory and added to your PATH.

## Quick Start

Install getpkg with one command:

```bash
curl https://getbin.xyz/getpkg-install | bash
```

After installation, restart your shell or run:
```bash
source ~/.bashrc
```

## Basic Commands

### Install Tools
```bash
getpkg install <tool_name>   # Install a tool
getpkg list                  # See all available tools
getpkg update                # Update all installed tools
```

### Manage Tools
```bash
getpkg uninstall <tool_name> # Remove a tool
getpkg version               # Check getpkg version
getpkg help                  # Show all commands
```

## Popular Tools

Install these useful developer tools:

```bash
getpkg install bb64          # Bash-compatible base64 encoder/decoder
getpkg install dehydrate     # Convert files to C++ source code
getpkg install whatsdirty    # Check git repository status
getpkg install sos           # Simple object storage client
getpkg install gp            # Git push utility
```

## How It Works

When you install a tool:
1. Downloads from getpkg.xyz
2. Installs to `~/.getpkg/<tool_name>/`
3. Creates shortcuts in `~/.local/bin/getpkg/`
4. Adds to your PATH automatically
5. Enables bash completion
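The steps above can be sketched by hand against a sandbox directory (the paths mirror the README; the tool name and its contents are hypothetical, and the real installer also handles PATH and completion):

```bash
#!/bin/bash
set -euo pipefail

# Simulate an install into a throwaway directory instead of the real $HOME.
HOME_DIR=$(mktemp -d)
TOOL=mytool   # hypothetical tool name

# Step 2: install the tool files to ~/.getpkg/<tool_name>/
mkdir -p "$HOME_DIR/.getpkg/$TOOL"
printf '#!/bin/bash\necho hi\n' > "$HOME_DIR/.getpkg/$TOOL/$TOOL"
chmod +x "$HOME_DIR/.getpkg/$TOOL/$TOOL"

# Step 3: create a shortcut (symlink) in ~/.local/bin/getpkg/
mkdir -p "$HOME_DIR/.local/bin/getpkg"
ln -s "$HOME_DIR/.getpkg/$TOOL/$TOOL" "$HOME_DIR/.local/bin/getpkg/$TOOL"

# Step 4 in the real installer adds ~/.local/bin/getpkg to PATH;
# here we invoke the shortcut directly.
"$HOME_DIR/.local/bin/getpkg/$TOOL"
```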
## File Locations

- **Installed tools**: `~/.getpkg/<tool_name>/`
- **Shortcuts**: `~/.local/bin/getpkg/` (in your PATH)
- **Settings**: `~/.config/getpkg/`

## Architecture Support

getpkg automatically downloads the right version for your system:
- Intel/AMD 64-bit (`x86_64`)
- ARM 64-bit (`aarch64`)
- Universal (works everywhere)
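The selection-with-fallback behavior described above can be sketched as follows (the `pick_arch` function is an assumption for illustration, not getpkg's actual code):

```bash
#!/bin/bash
set -euo pipefail

# Try the native architecture first, then fall back to universal.
# $1 is the native arch; remaining args are the architectures published
# for the tool.
pick_arch() {
  local native="$1"; shift
  local candidate published
  for candidate in "$native" universal; do
    for published in "$@"; do
      if [ "$candidate" = "$published" ]; then
        echo "$candidate"
        return 0
      fi
    done
  done
  return 1
}

pick_arch "$(uname -m)" x86_64 aarch64 universal
pick_arch "riscv64" x86_64 universal   # no native build, so universal
```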
## Troubleshooting

**Tool not found after install?**
```bash
source ~/.bashrc
```

**Permission errors?**
getpkg installs to your home directory - no root access needed.

**Network issues?**
Check your internet connection to `getpkg.xyz`.

## Need Help?

```bash
getpkg help   # Show detailed help
getpkg list   # See what's available
```
@@ -13,7 +13,14 @@ mkdir -p "${SCRIPT_DIR}/output"
# make sure we have the latest base image.
docker pull gitea.jde.nz/public/dropshell-build-base:latest

# Build with or without cache based on NO_CACHE environment variable
CACHE_FLAG=""
if [ "${NO_CACHE:-false}" = "true" ]; then
    CACHE_FLAG="--no-cache"
fi

docker build \
    ${CACHE_FLAG} \
    -t "${PROJECT}-build" \
    -f "${SCRIPT_DIR}/Dockerfile.dropshell-build" \
    --build-arg PROJECT="${PROJECT}" \
bb64/clean.sh (new executable file, 24 lines)

@@ -0,0 +1,24 @@
#!/bin/bash

set -euo pipefail

SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
PROJECT="bb64"

echo "Cleaning ${PROJECT}..."

# Remove output directory
if [ -d "${SCRIPT_DIR}/output" ]; then
    echo "Removing output directory..."
    rm -rf "${SCRIPT_DIR}/output"
fi

# Remove Docker images related to this project
echo "Removing Docker images..."
docker images --filter "reference=${PROJECT}-build*" -q | xargs -r docker rmi -f

# Remove Docker build cache
echo "Pruning Docker build cache..."
docker builder prune -f

echo "✓ ${PROJECT} cleaned successfully"
@@ -20,7 +20,14 @@ echo "Building version $VERSION" >&2
# build release version
export CMAKE_BUILD_TYPE="Release"

# Build with or without cache based on NO_CACHE environment variable
CACHE_FLAG=""
if [ "${NO_CACHE:-false}" = "true" ]; then
    CACHE_FLAG="--no-cache"
fi

docker build \
    ${CACHE_FLAG} \
    -t "${PROJECT}-build" \
    -f "${SCRIPT_DIR}/Dockerfile.dropshell-build" \
    --build-arg PROJECT="${PROJECT}" \
@@ -84,9 +91,11 @@ if git rev-parse "$TAG" >/dev/null 2>&1; then
fi

# Check if tag exists on remote
TAG_EXISTS_ON_REMOTE=false
if git ls-remote --tags origin | grep -q "refs/tags/$TAG"; then
    echo "Tag $TAG already exists on remote - this is expected for multi-architecture builds"
    echo "Skipping tag creation and proceeding with release attachment..."
    TAG_EXISTS_ON_REMOTE=true
else
    echo "Creating new tag $TAG..."
    git tag -a "$TAG" -m "Release $TAG"
@@ -105,12 +114,20 @@ echo "Getting or creating release $TAG on Gitea..."
EXISTING_RELEASE=$(curl -s -X GET "$API_URL/releases/tags/$TAG" \
    -H "Authorization: token $RELEASE_WRITE_TOKEN")

echo "Existing release check response: $EXISTING_RELEASE" >&2

if echo "$EXISTING_RELEASE" | grep -q '"id":[0-9]*'; then
    # Release already exists, get its ID
    RELEASE_ID=$(echo "$EXISTING_RELEASE" | grep -o '"id":[0-9]*' | head -1 | cut -d: -f2)
    echo "Release $TAG already exists with ID: $RELEASE_ID"
else
-   # Create new release
+   # Create new release only if tag was just created
+   if [ "$TAG_EXISTS_ON_REMOTE" = true ]; then
+       echo "Tag exists on remote but no release found - this shouldn't happen" >&2
+       echo "API response was: $EXISTING_RELEASE" >&2
+       exit 1
+   fi

    echo "Creating new release $TAG on Gitea..."
    RELEASE_RESPONSE=$(curl -s -X POST "$API_URL/releases" \
        -H "Content-Type: application/json" \
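The `grep`/`cut` pipeline above extracts the numeric release ID from the API's JSON response. It can be exercised standalone against a sample payload (the JSON values here are made up):

```bash
#!/bin/bash
set -euo pipefail

# Fabricated sample of the shape the Gitea release API returns.
EXISTING_RELEASE='{"id":42,"tag_name":"v2025.0720","draft":false}'

if echo "$EXISTING_RELEASE" | grep -q '"id":[0-9]*'; then
  # Same extraction as the publish script: first "id" field,
  # then the digits after the colon.
  RELEASE_ID=$(echo "$EXISTING_RELEASE" | grep -o '"id":[0-9]*' | head -1 | cut -d: -f2)
  echo "$RELEASE_ID"
fi
```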
@@ -2,9 +2,6 @@
set -uo pipefail  # Remove -e to handle errors manually
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"

docker builder prune -f

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
@@ -237,15 +234,17 @@ function print_summary() {
    echo
}

-title "🔨 BUILDING ALL TOOLS 🔨"
+title "🔨 BUILDING GETPKG 🔨"

-getpkg/build.sh
+"${SCRIPT_DIR}/getpkg/build.sh"
export GETPKG="${SCRIPT_DIR}/getpkg/output/getpkg"
if [ ! -f "$GETPKG" ]; then
    echo "Build failed."
    exit 1
fi

+title "🔨 BUILDING ALL TOOLS 🔨"

buildtestpublish_all

print_summary
clean.sh (new executable file, 44 lines)

@@ -0,0 +1,44 @@
#!/bin/bash

set -euo pipefail

SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"

echo "🧹 CLEANING ALL PROJECTS 🧹"
echo

# Get all project directories
PROJECT_DIRS=$(find "$SCRIPT_DIR" -maxdepth 1 -type d \
    -not -name ".*" \
    -not -path "$SCRIPT_DIR" | sort)

for dir in $PROJECT_DIRS; do
    PROJECT_NAME=$(basename "$dir")

    if [ -f "$dir/clean.sh" ]; then
        echo "Cleaning $PROJECT_NAME..."
        cd "$dir"
        ./clean.sh
        echo
    else
        echo "⚠️  No clean.sh found for $PROJECT_NAME, skipping..."
        echo
    fi
done

# Global Docker cleanup
echo "🐳 Global Docker cleanup..."
echo "Removing unused Docker images..."
docker image prune -f

echo "Removing unused Docker containers..."
docker container prune -f

echo "Removing unused Docker networks..."
docker network prune -f

echo "Removing unused Docker volumes..."
docker volume prune -f

echo
echo "✅ All projects cleaned successfully!"
@@ -1,65 +0,0 @@
ARG IMAGE_TAG
FROM gitea.jde.nz/public/dropshell-build-base:latest AS builder

ARG PROJECT
ARG CMAKE_BUILD_TYPE=Debug

# Set working directory
WORKDIR /app

SHELL ["/bin/bash", "-c"]

# Create cache directories
RUN mkdir -p /ccache

# Set up ccache
ENV CCACHE_DIR=/ccache
ENV CCACHE_COMPILERCHECK=content
ENV CCACHE_MAXSIZE=2G

# Copy build files
COPY CMakeLists.txt ./
COPY src/version.hpp.in src/

# Copy source files
COPY src/ src/
COPY contrib/ contrib/

# Configure project
RUN --mount=type=cache,target=/ccache \
    --mount=type=cache,target=/build \
    mkdir -p /build && \
    cmake -G Ninja -S /app -B /build \
        -DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE} \
        -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
        -DCMAKE_C_COMPILER_LAUNCHER=ccache \
        -DCMAKE_EXE_LINKER_FLAGS="-fuse-ld=mold -static -g" \
        -DCMAKE_CXX_FLAGS="-g -fno-omit-frame-pointer" \
        -DCMAKE_C_FLAGS="-g -fno-omit-frame-pointer" \
        -DPROJECT_NAME="${PROJECT}" \
        -DCMAKE_STRIP=OFF \
        ${CMAKE_TOOLCHAIN_FILE:+-DCMAKE_TOOLCHAIN_FILE=$CMAKE_TOOLCHAIN_FILE}

# Build project
RUN --mount=type=cache,target=/ccache \
    --mount=type=cache,target=/build \
    cmake --build /build

# Copy the built executable to a regular directory for the final stage
RUN --mount=type=cache,target=/build \
    mkdir -p /output && \
    find /build -type f -executable -name "*${PROJECT}*" -exec cp {} /output/${PROJECT} \; || \
    find /build -type f -executable -exec cp {} /output/${PROJECT} \;

# if we're a release build, then run upx on the binary.
RUN if [ "${CMAKE_BUILD_TYPE}" = "Release" ]; then \
    upx /output/${PROJECT}; \
    fi

# Final stage that only contains the binary
FROM scratch AS project

ARG PROJECT

# Copy the actual binary from the regular directory
COPY --from=builder /output/${PROJECT} /${PROJECT}
@@ -1,22 +1,52 @@
#!/bin/bash

set -euo pipefail

# Get script directory - handle different execution contexts
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
-PROJECT="dehydrate"
+PROJECT="$(basename "${SCRIPT_DIR}")"

export CMAKE_BUILD_TYPE="Debug"
# Debug output for CI
echo "${PROJECT} build script running from: ${SCRIPT_DIR}"

-rm -rf "${SCRIPT_DIR}/output"
-mkdir -p "${SCRIPT_DIR}/output"
# handle running locally, or docker in docker via gitea runner.
if [ -n "${GITEA_CONTAINER_NAME:-}" ]; then
    echo "We're in a gitea container: ${GITEA_CONTAINER_NAME}"
    VOLUME_OPTS=("--volumes-from=${GITEA_CONTAINER_NAME}")
    WORKING_DIR=("-w" "${GITHUB_WORKSPACE}/${PROJECT}")
    BUILD_DIR="${GITHUB_WORKSPACE}/${PROJECT}/build"
    OUTPUT_DIR="${GITHUB_WORKSPACE}/${PROJECT}/output"
else
    VOLUME_OPTS=("-v" "${SCRIPT_DIR}:/app")
    WORKING_DIR=("-w" "/app")
    BUILD_DIR="${SCRIPT_DIR}/build"
    OUTPUT_DIR="${SCRIPT_DIR}/output"
fi

# make sure we have the latest base image.
docker pull gitea.jde.nz/public/dropshell-build-base:latest
# Create output directory
mkdir -p "${OUTPUT_DIR}"

-docker build \
-    -t "${PROJECT}-build" \
-    -f "${SCRIPT_DIR}/Dockerfile.dropshell-build" \
-    --build-arg PROJECT="${PROJECT}" \
-    --build-arg CMAKE_BUILD_TYPE="${CMAKE_BUILD_TYPE}" \
-    --output "${SCRIPT_DIR}/output" \
-    "${SCRIPT_DIR}"
# Run build in container with mounted directories
COMMAND_TO_RUN="cmake -G Ninja -S . -B ./build \
    -DCMAKE_BUILD_TYPE=\${CMAKE_BUILD_TYPE} \
    -DPROJECT_NAME=${PROJECT} && \
    cmake --build ./build"

echo "Building in new docker container"
docker run --rm \
    --user "$(id -u):$(id -g)" \
    "${VOLUME_OPTS[@]}" \
    "${WORKING_DIR[@]}" \
    -e CMAKE_BUILD_TYPE="${CMAKE_BUILD_TYPE:-Debug}" \
    gitea.jde.nz/public/dropshell-build-base:latest \
    bash -c "${COMMAND_TO_RUN}"

# Copy built executable to output directory
if [ -f "${BUILD_DIR}/${PROJECT}" ]; then
    cp "${BUILD_DIR}/${PROJECT}" "${OUTPUT_DIR}/"
    echo "✓ Build successful - ${PROJECT} copied to ${OUTPUT_DIR}/"
else
    echo "✗ Build failed - ${PROJECT} not found in ${BUILD_DIR}/"
    exit 1
fi

echo "Build complete"
dehydrate/clean.sh (new executable file, 18 lines)

@@ -0,0 +1,18 @@
#!/bin/bash

set -euo pipefail

SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
PROJECT="$(basename "$(dirname "${SCRIPT_DIR}")")"

echo "Cleaning ${PROJECT}..."

# Remove output and build directories
for dir in "output" "build"; do
    if [ -d "${SCRIPT_DIR}/${dir}" ]; then
        echo "Removing ${dir} directory..."
        rm -rf "${SCRIPT_DIR:?}/${dir}"
    fi
done

echo "✓ ${PROJECT} cleaned successfully"
@@ -35,14 +35,7 @@ heading "Building ${PROJECT}"

# build release version
export CMAKE_BUILD_TYPE="Release"

-docker build \
-    -t "${PROJECT}-build" \
-    -f "${SCRIPT_DIR}/Dockerfile.dropshell-build" \
-    --build-arg PROJECT="${PROJECT}" \
-    --build-arg CMAKE_BUILD_TYPE="${CMAKE_BUILD_TYPE}" \
-    --output "${OUTPUT}" \
-    "${SCRIPT_DIR}"
+"${SCRIPT_DIR}/build.sh"

[ -f "${OUTPUT}/${PROJECT}" ] || die "Build failed."
@@ -4,8 +4,20 @@ set -euo pipefail

SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
PROJECT="dehydrate"
-DEHYDRATE="${SCRIPT_DIR}/output/${PROJECT}"
-TEST_DIR="${SCRIPT_DIR}/test_temp"

+# Handle running locally or in Gitea runner
+if [ -n "${GITEA_CONTAINER_NAME:-}" ]; then
+    echo "Running in Gitea CI environment"
+    echo "GITHUB_WORKSPACE: ${GITHUB_WORKSPACE}"
+    echo "Current directory: $(pwd)"
+    OUTPUT_DIR="${GITHUB_WORKSPACE}/dehydrate/output"
+    TEST_DIR="${GITHUB_WORKSPACE}/dehydrate/test_temp"
+else
+    OUTPUT_DIR="${SCRIPT_DIR}/output"
+    TEST_DIR="${SCRIPT_DIR}/test_temp"
+fi

+DEHYDRATE="${OUTPUT_DIR}/${PROJECT}"

# Colors for output
RED='\033[0;31m'

@@ -45,10 +57,31 @@ mkdir -p "$TEST_DIR"

echo -e "${YELLOW}Running dehydrate tests...${NC}\n"

# Debug output
echo "Looking for dehydrate at: $DEHYDRATE"
echo "Workspace structure:"
ls -la "${GITHUB_WORKSPACE}" 2>/dev/null || echo "Workspace not found"
echo "Dehydrate directory contents:"
ls -la "${GITHUB_WORKSPACE}/dehydrate" 2>/dev/null || echo "Dehydrate directory not found"
echo "Output directory contents:"
ls -la "$OUTPUT_DIR" 2>/dev/null || echo "Output directory not found"

# Check if dehydrate binary exists
if [ ! -f "$DEHYDRATE" ]; then
    echo -e "${RED}Error: dehydrate binary not found at $DEHYDRATE${NC}"
    echo "Please run ./build.sh first to build dehydrate"

    if [ -n "${GITEA_CONTAINER_NAME:-}" ]; then
        echo "Checking if build directory exists..."
        BUILD_DIR="${GITHUB_WORKSPACE}/dehydrate/build"
        if [ -d "$BUILD_DIR" ]; then
            echo "Build directory exists, checking contents:"
            ls -la "$BUILD_DIR"
        else
            echo "Build directory $BUILD_DIR does not exist"
        fi
    fi

    exit 1
fi
@@ -4,7 +4,7 @@
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
PROJECT_DIR="$( cd "$SCRIPT_DIR/.." && pwd )"

-cd "$SCRIPT_DIR"
+cd "$SCRIPT_DIR" || exit 1

# Clean up old test data and any existing binaries
# Force removal with chmod to handle permission issues
@@ -1,83 +0,0 @@
ARG IMAGE_TAG
FROM gitea.jde.nz/public/dropshell-build-base:latest AS builder

ARG PROJECT
ARG CMAKE_BUILD_TYPE=Debug

# Set working directory
WORKDIR /app

SHELL ["/bin/bash", "-c"]

# Create cache directories
RUN mkdir -p /ccache

# Set up ccache
ENV CCACHE_DIR=/ccache
ENV CCACHE_COMPILERCHECK=content
ENV CCACHE_MAXSIZE=2G

# Copy only build files first (for better layer caching)
COPY CMakeLists.txt cmake_prebuild.sh ./
COPY src/version.hpp.in src/

# Run prebuild script early (this rarely changes)
RUN bash cmake_prebuild.sh

# Copy source files (this invalidates cache when source changes)
COPY src/ src/

# Configure project (this step is cached unless CMakeLists.txt changes)
RUN --mount=type=cache,target=/ccache \
    --mount=type=cache,target=/build \
    mkdir -p /build && \
    SSL_LIB=$(find /usr/local -name "libssl.a" | head -1) && \
    CRYPTO_LIB=$(find /usr/local -name "libcrypto.a" | head -1) && \
    echo "Found SSL: $SSL_LIB, Crypto: $CRYPTO_LIB" && \
    cmake -G Ninja -S /app -B /build \
        -DCMAKE_BUILD_TYPE=${CMAKE_BUILD_TYPE} \
        -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
        -DCMAKE_C_COMPILER_LAUNCHER=ccache \
        -DCMAKE_EXE_LINKER_FLAGS="-fuse-ld=mold -static -g" \
        -DCMAKE_CXX_FLAGS="-g -fno-omit-frame-pointer" \
        -DCMAKE_C_FLAGS="-g -fno-omit-frame-pointer" \
        -DPROJECT_NAME="${PROJECT}" \
        -DCMAKE_STRIP=OFF \
        -DOPENSSL_SSL_LIBRARY="$SSL_LIB" \
        -DOPENSSL_CRYPTO_LIBRARY="$CRYPTO_LIB" \
        -DOPENSSL_INCLUDE_DIR=/usr/local/include \
        ${CMAKE_TOOLCHAIN_FILE:+-DCMAKE_TOOLCHAIN_FILE=$CMAKE_TOOLCHAIN_FILE}

# Run prebuild script
RUN --mount=type=cache,target=/ccache \
    --mount=type=cache,target=/build \
    cmake --build /build --target run_prebuild_script

# Build project (ccache will help here when only some files change)
RUN --mount=type=cache,target=/ccache \
    --mount=type=cache,target=/build \
    cmake --build /build

# Copy the built executable to a regular directory for the final stage
RUN --mount=type=cache,target=/build \
    mkdir -p /output && \
    find /build -type f -executable -name "*${PROJECT}*" -exec cp {} /output/${PROJECT} \; || \
    find /build -type f -executable -exec cp {} /output/${PROJECT} \;

# if we're a release build, then run upx on the binary.
RUN if [ "${CMAKE_BUILD_TYPE}" = "Release" ]; then \
    upx /output/${PROJECT}; \
    fi

# Final stage that only contains the binary
FROM scratch AS project

ARG PROJECT

# Copy CA certificates for SSL validation
#COPY --from=builder /etc/ssl/certs/ /etc/ssl/certs/

# Copy the actual binary from the regular directory
COPY --from=builder /output/${PROJECT} /${PROJECT}
207 getpkg/README.md Normal file
@@ -0,0 +1,207 @@
# getpkg - Package Manager for Dropshell Tools

getpkg is a command-line package manager that simplifies tool installation, management, and publishing for the dropshell ecosystem. Tools are installed to `~/.getpkg/` with executable symlinks in `~/.local/bin/getpkg/` and automatically added to your PATH with bash completion.

## Installation

Install getpkg with a single command:

```bash
curl https://getbin.xyz/getpkg-install | bash
```

After installation, restart your shell or run `source ~/.bashrc` to enable the new PATH and completion settings.

## Basic Usage

### Installing Tools

Install any tool from the getpkg registry:

```bash
# Install a tool
getpkg install whatsdirty
```

### Managing Installed Tools

```bash
# List all available commands
getpkg help

# Update all installed tools
getpkg update

# Uninstall a tool
getpkg uninstall whatsdirty

# Check getpkg version
getpkg version
```

## Available Commands

### Core Package Management

- **`getpkg install <tool_name>`** - Install or update a tool
- **`getpkg uninstall <tool_name>`** - Remove an installed tool
- **`getpkg update`** - Update getpkg and all installed tools

### Publishing (Requires SOS_WRITE_TOKEN)

- **`getpkg publish <tool_name[:ARCH]> <folder>`** - Upload a tool to getpkg.xyz
- **`getpkg unpublish <tool_name[:ARCH]>`** - Remove a published tool
- **`getpkg unpublish <hash>`** - Remove a published tool by hash

### Development Tools

- **`getpkg create <tool_name> <directory>`** - Create a new tool project
- **`getpkg hash <file_or_directory>`** - Calculate the hash of files/directories

### Information

- **`getpkg list`** - List all available packages with status
- **`getpkg clean`** - Clean up orphaned configs and symlinks
- **`getpkg version`** - Show getpkg version
- **`getpkg help`** - Show detailed help
- **`getpkg autocomplete`** - Show available commands for completion

## How It Works

### Installation Process

When you install a tool, getpkg:

1. **Downloads** the tool archive from getpkg.xyz
2. **Extracts** it to `~/.getpkg/<tool_name>/`
3. **Creates symlinks** for all executables in `~/.local/bin/getpkg/`
4. **Ensures PATH** includes `~/.local/bin/getpkg` (one-time setup)
5. **Enables bash completion** for the tool
6. **Runs setup** if a `setup_script.sh` exists
7. **Stores metadata** in `~/.config/getpkg/<tool_name>.json`
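
The flow above can be sketched in shell. Everything below (`mytool`, the staged archive, the exact paths) is illustrative only, since getpkg implements these steps in C++:

```bash
#!/bin/bash
# Illustrative sketch of the install flow; the "mytool" package and its
# staged archive are fabricated here so the demo is self-contained.
set -euo pipefail

TOOL="mytool"
STAGE="$(mktemp -d)"

# Stand-in for step 1: fabricate a "downloaded" archive with one executable.
mkdir -p "$STAGE/pkg"
printf '#!/bin/sh\necho "%s 1.0"\n' "$TOOL" > "$STAGE/pkg/$TOOL"
chmod +x "$STAGE/pkg/$TOOL"
tar -czf "$STAGE/$TOOL.tgz" -C "$STAGE/pkg" "$TOOL"

TOOL_DIR="$HOME/.getpkg/$TOOL"        # step 2: extraction target
BIN_DIR="$HOME/.local/bin/getpkg"     # step 3: symlink directory
mkdir -p "$TOOL_DIR" "$BIN_DIR"

tar -xzf "$STAGE/$TOOL.tgz" -C "$TOOL_DIR"   # step 2: extract

# Step 3: symlink every executable into the PATH directory.
find "$TOOL_DIR" -maxdepth 1 -type f -perm -u+x | while read -r exe; do
    ln -sf "$exe" "$BIN_DIR/$(basename "$exe")"
done

# Step 6: run the optional setup script if the package shipped one.
[ -f "$TOOL_DIR/setup_script.sh" ] && bash "$TOOL_DIR/setup_script.sh" || true

"$BIN_DIR/$TOOL"   # the tool is now runnable via the symlink directory
```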

### Architecture Support

getpkg supports multiple architectures:

- `x86_64` (Intel/AMD 64-bit)
- `aarch64` (ARM 64-bit)
- `universal` (cross-platform tools)

Tools are automatically downloaded for your architecture, with fallback to universal versions.
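
The fallback amounts to: try the native architecture first, then retry with `universal`. A sketch of that order; `detect_arch` and `try_download` are illustrative stand-ins, not getpkg internals:

```bash
#!/bin/bash
# Sketch of the architecture-selection order with a universal fallback.
set -euo pipefail

detect_arch() {
    case "$(uname -m)" in
        x86_64)        echo "x86_64" ;;
        aarch64|arm64) echo "aarch64" ;;
        *)             echo "universal" ;;
    esac
}

# Hypothetical fetcher: pretend only the universal build is published.
try_download() {
    [ "$2" = "universal" ]
}

fetch_tool() {
    local tool="$1" arch
    arch="$(detect_arch)"
    if try_download "$tool" "$arch"; then
        echo "$tool:$arch"                            # native build found
    else
        try_download "$tool" "universal" && echo "$tool:universal"
    fi
}

fetch_tool mytool   # falls back to mytool:universal in this demo
```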

### File Locations

- **Tool files**: `~/.getpkg/<tool_name>/` (actual tool installation)
- **Executable symlinks**: `~/.local/bin/getpkg/` (in your PATH)
- **Configuration**: `~/.config/getpkg/`
- **PATH setup**: `~/.bashrc_getpkg` (sourced by `~/.bashrc`)
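
The PATH setup works by having `~/.bashrc` source `~/.bashrc_getpkg`, which prepends the symlink directory to PATH. A sketch of the idea; the exact lines getpkg writes may differ:

```bash
#!/bin/bash
# Sketch of the one-time PATH hookup (assumed content, not getpkg's exact output).
printf 'export PATH="$HOME/.local/bin/getpkg:$PATH"\n' > "$HOME/.bashrc_getpkg"

# The line added (once) to ~/.bashrc amounts to:
[ -f "$HOME/.bashrc_getpkg" ] && . "$HOME/.bashrc_getpkg"

echo "$PATH" | grep -q "$HOME/.local/bin/getpkg" && echo "PATH configured"
```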

## Examples

### Installing Popular Tools

```bash
# Install available tools
getpkg install dehydrate    # File to C++ code generator
getpkg install bb64         # Bash base64 encoder/decoder

# Development tools (for repository development)
getpkg install whatsdirty   # Check git repo status
getpkg install sos          # Simple object storage client
getpkg install gp           # Git push utility
```

### Publishing Your Own Tools

```bash
# Set your publishing token
export SOS_WRITE_TOKEN="your-token-here"

# Create a new tool project
getpkg create mytool ./mytool-project

# Publish an architecture-specific build
getpkg publish mytool:x86_64 ./build/

# Publish a universal tool
getpkg publish mytool ./build/

# Remove a published tool
getpkg unpublish mytool:x86_64
```

### Development Workflow

```bash
# Create the tool structure
getpkg create awesome-tool ./awesome-tool
cd awesome-tool

# Build your tool...
# Add the executable to the directory

# Test locally
./awesome-tool --version

# Publish when ready
getpkg publish awesome-tool:x86_64 .
```

## Environment Variables

- **`SOS_WRITE_TOKEN`** - Authentication token for publishing tools

## Troubleshooting

### Tool Not Found

If a tool isn't found after installation, ensure your shell has loaded the new PATH:

```bash
source ~/.bashrc
```

### Permission Issues

getpkg installs to your home directory and doesn't require root access. If you encounter permission issues, check that `~/.local/bin/` is writable.

### Network Issues

All tools are downloaded from `getpkg.xyz`. Ensure you have internet connectivity and that the domain is accessible.

## Development

### Building getpkg

```bash
# Build the debug version
cd getpkg && ./build.sh

# Run tests
cd getpkg && ./test.sh

# Publish (requires SOS_WRITE_TOKEN)
cd getpkg && ./publish.sh
```

### Tool Development

When creating tools for getpkg:

1. Create a directory with your tool binary
2. Optionally include a `setup_script.sh` for post-install setup
3. The tool should support `version` and `autocomplete` subcommands
4. Use `getpkg publish` to upload to the registry
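
A minimal tool satisfying point 3 looks like the following; it is wrapped in a function so the sketch is self-contained, and `mytool` is a hypothetical name:

```bash
#!/bin/bash
# Smallest shape of a getpkg-compatible tool: respond to "version" and
# "autocomplete". In a real package this is the tool's main script.
mytool() {
    case "${1:-}" in
        version)      echo "1.0.0" ;;
        autocomplete) printf 'help\nversion\n' ;;
        *)            echo "usage: mytool {version|autocomplete}" >&2; return 1 ;;
    esac
}

mytool version        # getpkg update compares this against the registry
mytool autocomplete   # bash completion pulls candidate words from this
```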

### Testing

The test script creates all temporary files and directories in `test_temp/` to keep the main directory clean:

```bash
# Run tests
./test.sh

# Clean up orphaned test files from old test runs (one-time)
bash cleanup_old_test_files.sh

# Clean up orphaned test packages from getpkg.xyz
bash cleanup_test_packages.sh
```
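
The containment pattern boils down to one scratch directory plus an exit trap; a minimal sketch (the real `test.sh` is far larger):

```bash
#!/bin/bash
# Keep every test artifact under one scratch directory and remove it on exit.
set -euo pipefail

TEST_DIR="$(pwd)/test_temp"
mkdir -p "$TEST_DIR"
cleanup() { rm -rf "$TEST_DIR"; }   # runs on any exit, success or failure
trap cleanup EXIT

echo "scratch data" > "$TEST_DIR/scratch.txt"
echo "test artifacts confined to $TEST_DIR"
```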

For more details, see the development documentation in each tool's directory.
@@ -1,25 +1,52 @@
#!/bin/bash

set -euo pipefail

# Get script directory - handle different execution contexts
SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
PROJECT="$(basename "${SCRIPT_DIR}")"

# Debug output for CI
echo "${PROJECT} build script running from: ${SCRIPT_DIR}"

export CMAKE_BUILD_TYPE="Debug"
# handle running locally, or docker in docker via gitea runner.
if [ -n "${GITEA_CONTAINER_NAME:-}" ]; then
    echo "We're in a gitea container: ${GITEA_CONTAINER_NAME}"
    VOLUME_OPTS=("--volumes-from=${GITEA_CONTAINER_NAME}")
    WORKING_DIR=("-w" "${GITHUB_WORKSPACE}/${PROJECT}")
    BUILD_DIR="${GITHUB_WORKSPACE}/${PROJECT}/build"
    OUTPUT_DIR="${GITHUB_WORKSPACE}/${PROJECT}/output"
else
    VOLUME_OPTS=("-v" "${SCRIPT_DIR}:/app")
    WORKING_DIR=("-w" "/app")
    BUILD_DIR="${SCRIPT_DIR}/build"
    OUTPUT_DIR="${SCRIPT_DIR}/output"
fi

rm -rf "${SCRIPT_DIR}/output"
mkdir -p "${SCRIPT_DIR}/output"
# Create output directory
mkdir -p "${OUTPUT_DIR}"

PROJECT="getpkg"
# Run build in container with mounted directories
COMMAND_TO_RUN="cmake -G Ninja -S . -B ./build \
    -DCMAKE_BUILD_TYPE=\${CMAKE_BUILD_TYPE} \
    -DPROJECT_NAME=${PROJECT} && \
    cmake --build ./build"

# make sure we have the latest base image.
docker pull gitea.jde.nz/public/dropshell-build-base:latest
echo "Building in new docker container"
docker run --rm \
    --user "$(id -u):$(id -g)" \
    "${VOLUME_OPTS[@]}" \
    "${WORKING_DIR[@]}" \
    -e CMAKE_BUILD_TYPE="${CMAKE_BUILD_TYPE:-Debug}" \
    gitea.jde.nz/public/dropshell-build-base:latest \
    bash -c "${COMMAND_TO_RUN}"

docker build \
    -t "${PROJECT}-build" \
    -f "${SCRIPT_DIR}/Dockerfile.dropshell-build" \
    --build-arg PROJECT="${PROJECT}" \
    --build-arg CMAKE_BUILD_TYPE="${CMAKE_BUILD_TYPE}" \
    --output "${SCRIPT_DIR}/output" \
    "${SCRIPT_DIR}"
# Copy built executable to output directory
if [ -f "${BUILD_DIR}/${PROJECT}" ]; then
    cp "${BUILD_DIR}/${PROJECT}" "${OUTPUT_DIR}/"
    echo "✓ Build successful - ${PROJECT} copied to ${OUTPUT_DIR}/"
else
    echo "✗ Build failed - ${PROJECT} not found in ${BUILD_DIR}/"
    exit 1
fi

echo "Build complete"
18 getpkg/clean.sh Executable file
@@ -0,0 +1,18 @@
#!/bin/bash

set -euo pipefail

SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
PROJECT="$(basename "$(dirname "${SCRIPT_DIR}")")"

echo "Cleaning ${PROJECT}..."

# Remove output and build directories
for dir in "output" "build"; do
    if [ -d "${SCRIPT_DIR}/${dir}" ]; then
        echo "Removing ${dir} directory..."
        rm -rf "${SCRIPT_DIR:?}/${dir}"
    fi
done

echo "✓ ${PROJECT} cleaned successfully"
98 getpkg/cleanup_test_packages.sh Executable file
@@ -0,0 +1,98 @@
#!/bin/bash

# Cleanup script for orphaned test packages from getpkg testing
# This script removes test packages that start with "test-" from getpkg.xyz
# Run from the getpkg directory: bash cleanup_test_packages.sh

set -euo pipefail

GETPKG="./output/getpkg"

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color

echo -e "${YELLOW}Cleaning up orphaned test packages...${NC}"

# Check if getpkg binary exists
if [ ! -f "$GETPKG" ]; then
    echo -e "${RED}Error: getpkg binary not found at $GETPKG${NC}"
    echo "Please run ./build.sh first to build getpkg"
    exit 1
fi

# Check if SOS_WRITE_TOKEN is set
if [ -z "${SOS_WRITE_TOKEN:-}" ]; then
    echo -e "${RED}Error: SOS_WRITE_TOKEN environment variable is not set${NC}"
    echo "This token is required to unpublish packages from getpkg.xyz"
    exit 1
fi

echo "Using getpkg binary: $GETPKG"
echo "SOS_WRITE_TOKEN is set (${#SOS_WRITE_TOKEN} characters)"

# Get list of all packages from /dir endpoint
echo "Fetching package list from getpkg.xyz/dir..."
DIR_RESPONSE=$(curl -s "https://getpkg.xyz/dir" 2>/dev/null || echo "")

if [ -z "$DIR_RESPONSE" ]; then
    echo -e "${RED}Failed to fetch package list from server${NC}"
    exit 1
fi

# Extract test package labeltags from JSON response
# Try with jq first, fall back to grep/sed if jq is not available
if command -v jq >/dev/null 2>&1; then
    TEST_PACKAGES=$(echo "$DIR_RESPONSE" | jq -r '.entries[]?.labeltags[]? // empty' 2>/dev/null | grep "^test-" | sort -u || echo "")
else
    # Fallback: extract labeltags using grep and sed (less reliable but works without jq)
    TEST_PACKAGES=$(echo "$DIR_RESPONSE" | grep -o '"test-[^"]*"' | sed 's/"//g' | sort -u || echo "")
fi

if [ -z "$TEST_PACKAGES" ]; then
    echo -e "${GREEN}No test packages found to clean up${NC}"
    exit 0
fi

echo -e "\n${YELLOW}Found test packages to clean up:${NC}"
echo "$TEST_PACKAGES" | while read -r package; do
    echo " - $package"
done

echo -e "\n${YELLOW}Cleaning up test packages...${NC}"

CLEANED_COUNT=0
FAILED_COUNT=0

# Use a here-string so the loop runs in the current shell, not a subshell
while IFS= read -r package; do
    if [ -n "$package" ]; then
        echo -n "Cleaning up $package... "

        # Try to unpublish the package (temporarily disable set -e)
        set +e
        $GETPKG unpublish "$package" >/dev/null 2>&1
        UNPUBLISH_RESULT=$?
        set -e

        if [ $UNPUBLISH_RESULT -eq 0 ]; then
            echo -e "${GREEN}OK${NC}"
            CLEANED_COUNT=$((CLEANED_COUNT + 1))
        else
            echo -e "${RED}FAILED${NC}"
            FAILED_COUNT=$((FAILED_COUNT + 1))
        fi
    fi
done <<< "$TEST_PACKAGES"

echo -e "\n${YELLOW}Cleanup Summary:${NC}"
echo "Packages cleaned: $CLEANED_COUNT"
echo "Failed cleanups: $FAILED_COUNT"

if [ $FAILED_COUNT -eq 0 ]; then
    echo -e "${GREEN}All test packages cleaned up successfully!${NC}"
else
    echo -e "${YELLOW}Some packages failed to clean up. They may need manual removal.${NC}"
fi
@@ -1 +0,0 @@
Debug content
@@ -34,15 +34,7 @@ heading "Building ${PROJECT}"

# build release version
export CMAKE_BUILD_TYPE="Release"

docker build \
    -t "${PROJECT}-build" \
    -f "${SCRIPT_DIR}/Dockerfile.dropshell-build" \
    --build-arg PROJECT="${PROJECT}" \
    --build-arg CMAKE_BUILD_TYPE="${CMAKE_BUILD_TYPE}" \
    --output "${OUTPUT}" \
    "${SCRIPT_DIR}"

"${SCRIPT_DIR}/build.sh"
[ -f "${OUTPUT}/${PROJECT}" ] || die "Build failed."

#--------------------------------------------------------------------------------
@@ -5,6 +5,7 @@
#include <iostream>
#include <filesystem>
#include <sstream>
#include <set>
#include <algorithm>

using json = nlohmann::json;
@@ -207,7 +208,7 @@ bool GetbinClient::deleteObject(const std::string& hash, const std::string& token

bool GetbinClient::listPackages(std::vector<std::string>& outPackages) {
    try {
        std::string url = "https://" + SERVER_HOST + "/packages";
        std::string url = "https://" + SERVER_HOST + "/dir";

        auto response = cpr::Get(cpr::Url{url},
            cpr::Header{{"User-Agent", getUserAgent()}},
@@ -217,20 +218,31 @@ bool GetbinClient::listPackages(std::vector<std::string>& outPackages) {
        if (response.status_code == 200) {
            try {
                auto resp_json = json::parse(response.text);
                if (resp_json.is_array()) {
                if (resp_json.contains("entries") && resp_json["entries"].is_array()) {
                    outPackages.clear();
                    for (const auto& item : resp_json) {
                        if (item.is_string()) {
                            outPackages.push_back(item.get<std::string>());
                    std::set<std::string> uniqueTools;

                    for (const auto& entry : resp_json["entries"]) {
                        if (entry.contains("labeltags") && entry["labeltags"].is_array()) {
                            for (const auto& labeltag : entry["labeltags"]) {
                                if (labeltag.is_string()) {
                                    std::string tag = labeltag.get<std::string>();
                                    // Extract tool name from "tool:arch" format
                                    size_t colonPos = tag.find(":");
                                    if (colonPos != std::string::npos) {
                                        std::string toolName = tag.substr(0, colonPos);
                                        if (!toolName.empty()) {
                                            uniqueTools.insert(toolName);
                                        }
                                    }
                                }
                            }
                        }
                    }
                    return true;
                } else if (resp_json.contains("packages") && resp_json["packages"].is_array()) {
                    outPackages.clear();
                    for (const auto& item : resp_json["packages"]) {
                        if (item.is_string()) {
                            outPackages.push_back(item.get<std::string>());
                        }

                    // Convert set to vector
                    for (const auto& tool : uniqueTools) {
                        outPackages.push_back(tool);
                    }
                    return true;
                }
@@ -255,4 +267,51 @@ bool GetbinClient::listPackages(std::vector<std::string>& outPackages) {
        std::cerr << "[GetbinClient::listPackages] Exception: " << e.what() << std::endl;
        return false;
    }
}

bool GetbinClient::listAllEntries(std::vector<std::pair<std::string, std::vector<std::string>>>& outEntries) {
    try {
        std::string url = "https://" + SERVER_HOST + "/dir";

        auto response = cpr::Get(cpr::Url{url},
            cpr::Header{{"User-Agent", getUserAgent()}},
            cpr::Timeout{30000}, // 30 seconds
            cpr::VerifySsl{true});

        if (response.status_code == 200) {
            try {
                auto resp_json = json::parse(response.text);
                if (resp_json.contains("entries") && resp_json["entries"].is_array()) {
                    outEntries.clear();

                    for (const auto& entry : resp_json["entries"]) {
                        if (entry.contains("hash") && entry.contains("labeltags") &&
                            entry["hash"].is_string() && entry["labeltags"].is_array()) {

                            std::string hash = entry["hash"].get<std::string>();
                            std::vector<std::string> labeltags;

                            for (const auto& tag : entry["labeltags"]) {
                                if (tag.is_string()) {
                                    labeltags.push_back(tag.get<std::string>());
                                }
                            }

                            outEntries.push_back({hash, labeltags});
                        }
                    }
                    return true;
                }
            } catch (const json::exception& e) {
                std::cerr << "[GetbinClient::listAllEntries] JSON parse error: " << e.what() << std::endl;
            }
        } else {
            std::cerr << "[GetbinClient::listAllEntries] HTTP " << response.status_code << ": " << response.error.message << std::endl;
        }

        return false;
    } catch (const std::exception& e) {
        std::cerr << "[GetbinClient::listAllEntries] Exception: " << e.what() << std::endl;
        return false;
    }
}
@@ -17,8 +17,9 @@ public:
    bool getHash(const std::string& toolName, const std::string& arch, std::string& outHash);
    bool deleteObject(const std::string& hash, const std::string& token);
    bool listPackages(std::vector<std::string>& outPackages);
    bool listAllEntries(std::vector<std::pair<std::string, std::vector<std::string>>>& outEntries);

private:
    static const std::string SERVER_HOST;
    std::string getUserAgent() const;
};
@@ -76,6 +76,17 @@
namespace {
using json = nlohmann::json;

// Clear current line and reset cursor to beginning
void clearLine() {
    std::cout << "\r\033[K" << std::flush;
}

// Clear current line and print message
void clearAndPrint(const std::string& message) {
    clearLine();
    std::cout << message << std::flush;
}

// Compare versions (returns true if v1 < v2)
bool isVersionOlder(const std::string& v1, const std::string& v2) {
    // Simple version comparison - assumes versions are in YYYY.MMDD.HHMM format
@@ -215,14 +226,14 @@ int install_tool(int argc, char* argv[]) {
    std::cout << "Downloading " << toolName << "..." << std::flush;
    if (!getbin2.download(toolName, arch, archivePath.string(), progressCallback)) {
        // Try universal version as fallback
        std::cout << "\rArch-specific version not found, trying universal..." << std::endl;
        clearAndPrint("Arch-specific version not found, trying universal...\n");
        if (!getbin2.download(toolName, "universal", archivePath.string(), progressCallback)) {
            std::cerr << "\rFailed to download tool archive (tried both " << arch << " and universal)." << std::endl;
            return 1;
        }
        downloadArch = "universal";
    }
    std::cout << "\rDownloading " << toolName << "... done" << std::endl;
    clearAndPrint("Downloading " + toolName + "... done\n");

    // Unpack tool
    std::cout << "Unpacking..." << std::flush;
@@ -230,13 +241,13 @@ int install_tool(int argc, char* argv[]) {
        std::cerr << "\rFailed to unpack tool archive." << std::endl;
        return 1;
    }
    std::cout << "\rUnpacking... done" << std::endl;
    clearAndPrint("Unpacking... done\n");

    // Add to PATH and autocomplete
    std::cout << "Configuring..." << std::flush;
    scriptManager.addToolEntry(toolName, binDir.string());
    scriptManager.addAutocomplete(toolName);
    std::cout << "\rConfiguring... done" << std::endl;
    clearAndPrint("Configuring... done\n");

    // Get tool info
    std::string hash;
@@ -347,7 +358,7 @@ int publish_tool(int argc, char* argv[]) {
        std::cerr << "\rFailed to upload archive." << std::endl;
        return 1;
    }
    std::cout << "\rUploading... done" << std::endl;
    clearAndPrint("Uploading... done\n");
    std::cout << "Published! URL: " << url << "\nHash: " << hash << std::endl;
    return 0;
}
@@ -426,7 +437,7 @@ int update_tool(int argc, char* argv[]) {
            tool.status = "Check failed";
        }
    }
    std::cout << "\r" << std::string(50, ' ') << "\r" << std::flush; // Clear progress line
    clearLine(); // Clear progress line

    // Step 2: Update tools that need updating
    std::vector<std::tuple<std::string, std::string, std::string>> updateResults;
@@ -484,7 +495,7 @@ int update_tool(int argc, char* argv[]) {

        if (result == 0) {
            tool.status = "Updated";
            std::cout << " Updated" << std::endl;
            clearAndPrint("Updated\n");

            // Re-read version after update
            std::filesystem::path toolInfoPath = configDir / (tool.name + ".json");
@@ -502,7 +513,7 @@ int update_tool(int argc, char* argv[]) {
        }
    } else {
        tool.status = "Failed";
        std::cout << " Failed" << std::endl;
        clearAndPrint("Failed\n");
    }
}
}
@@ -701,35 +712,34 @@ int unpublish_tool(int argc, char* argv[]) {
            return 1;
        }
    } else {
        // No specific architecture - unpublish all architectures
        std::vector<std::string> allArchitectures = {"x86_64", "aarch64", "universal"};
        std::vector<std::pair<std::string, std::string>> foundPackages;
        // No specific architecture - unpublish ALL entries with this tool name
        std::vector<std::pair<std::string, std::vector<std::string>>> allEntries;
        std::vector<std::pair<std::string, std::string>> foundPackages; // (tag, hash)

        std::cout << "Searching for " << toolName << " across all architectures..." << std::endl;
        std::cout << "Searching for all entries with label '" << toolName << "'..." << std::endl;

        // Find all existing versions
        for (const auto& arch : allArchitectures) {
            std::string archHash;
            if (getbin.getHash(toolName, arch, archHash) && !archHash.empty()) {
                // Validate hash
                bool validHash = true;
                for (char c : archHash) {
                    if (!std::isdigit(c)) {
                        validHash = false;
                        break;
                    }
                }

                if (validHash) {
                    foundPackages.push_back({arch, archHash});
                    std::cout << " Found " << toolName << ":" << arch << " (hash: " << archHash << ")" << std::endl;
        if (!getbin.listAllEntries(allEntries)) {
            std::cerr << "Failed to get directory listing from server" << std::endl;
            return 1;
        }

        // Find all entries with labeltags starting with toolName:
        for (const auto& entry : allEntries) {
            const std::string& hash = entry.first;
            const std::vector<std::string>& labeltags = entry.second;

            for (const std::string& tag : labeltags) {
                if (tag.find(toolName + ":") == 0) {
                    // Found a matching labeltag
                    foundPackages.push_back({tag, hash});
                    std::cout << " Found " << tag << " (hash: " << hash << ")" << std::endl;
                    break; // Only count each hash once even if it has multiple matching tags
                }
            }
        }

        if (foundPackages.empty()) {
            std::cerr << "No packages found for " << toolName << std::endl;
            std::cerr << "Searched architectures: x86_64, aarch64, universal" << std::endl;
            return 1;
        }

@@ -741,7 +751,7 @@ int unpublish_tool(int argc, char* argv[]) {
        int failCount = 0;

        for (const auto& [arch, archHash] : foundPackages) {
            std::cout << " Unpublishing " << toolName << ":" << arch << "... ";
            std::cout << " Unpublishing " << arch << "... ";
            if (getbin.deleteObject(archHash, token)) {
                std::cout << "OK" << std::endl;
                successCount++;
@@ -824,7 +834,7 @@ int list_packages(int argc, char* argv[]) {
    for (const auto& packageName : availablePackages) {
        std::string status = "Available";
        std::string localVersion = "-";
        std::string remoteStatus = "✓";
        std::string remoteStatus = "-";

        auto it = installedPackages.find(packageName);
        if (it != installedPackages.end()) {
@@ -1140,6 +1150,85 @@ void show_help() {
    std::cout << " ~/.local/bin/getpkg/ Installed tool binaries" << std::endl;
}

int autocomplete_command(int argc, char* argv[]) {
    std::vector<std::string> args(argv + 2, argv + argc);

    // If no arguments, return all commands
    if (args.empty()) {
        std::cout << "install\n";
        std::cout << "uninstall\n";
        std::cout << "publish\n";
        std::cout << "unpublish\n";
        std::cout << "update\n";
        std::cout << "version\n";
        std::cout << "create\n";
        std::cout << "hash\n";
        std::cout << "list\n";
        std::cout << "clean\n";
        std::cout << "help\n";
        return 0;
    }

    const std::string& subcommand = args[0];

    // Handle autocompletion for specific commands
    if (subcommand == "install") {
        // For install, we could suggest popular packages or recently published ones
        // For now, just return empty (no specific completions)
        return 0;
    } else if (subcommand == "uninstall") {
        // For uninstall, list installed tools
        std::filesystem::path configDir = std::filesystem::path(std::getenv("HOME")) / ".config" / "getpkg";
        if (std::filesystem::exists(configDir)) {
            for (const auto& entry : std::filesystem::directory_iterator(configDir)) {
                if (entry.path().extension() == ".json") {
                    std::string toolName = entry.path().stem().string();
                    std::cout << toolName << "\n";
                }
            }
        }
        return 0;
    } else if (subcommand == "publish") {
        // For publish, suggest architecture suffixes after the tool name
        if (args.size() >= 2) {
            // If we have the tool name already, suggest architectures
            std::cout << "x86_64\n";
            std::cout << "aarch64\n";
            std::cout << "universal\n";
        }
        return 0;
    } else if (subcommand == "unpublish") {
        // For unpublish, list installed tools (similar to uninstall)
        std::filesystem::path configDir = std::filesystem::path(std::getenv("HOME")) / ".config" / "getpkg";
        if (std::filesystem::exists(configDir)) {
            for (const auto& entry : std::filesystem::directory_iterator(configDir)) {
                if (entry.path().extension() == ".json") {
                    std::string toolName = entry.path().stem().string();
                    std::cout << toolName << "\n";
                    // Also suggest with architecture suffixes
                    std::cout << toolName << ":x86_64\n";
                    std::cout << toolName << ":aarch64\n";
                    std::cout << toolName << ":universal\n";
                }
            }
        }
        return 0;
    } else if (subcommand == "create") {
        // For create, no specific completions (tool name and directory are user-defined)
        return 0;
    } else if (subcommand == "hash") {
        // For hash, suggest file extensions
        if (args.size() >= 2) {
            std::cout << "*.tgz\n";
            std::cout << "*.tar.gz\n";
        }
        return 0;
    }

    // No specific completions for other commands
    return 0;
}

} // end anonymous namespace

int main(int argc, char* argv[]) {
@@ -1159,19 +1248,7 @@ int main(int argc, char* argv[]) {
    } else if (command == "update") {
        return update_tool(argc, argv);
    } else if (command == "autocomplete") {
        std::vector<std::string> args(argv + 2, argv + argc);
        if (args.empty()) std::cout << R"(install
uninstall
publish
unpublish
update
version
create
hash
list
clean
help
)";
        return autocomplete_command(argc, argv);
    } else if (command == "version") {
        std::cout << dropshell::VERSION << std::endl;
    } else if (command == "create") {
@@ -1 +0,0 @@
test
@@ -1,7 +0,0 @@
#!/bin/bash
if [ "$1" = "version" ]; then
    echo "1.0.0"
elif [ "$1" = "autocomplete" ]; then
    echo "help"
    echo "version"
fi
@@ -1,7 +0,0 @@
#!/bin/bash
if [ "$1" = "version" ]; then
    echo "1.0.0"
elif [ "$1" = "autocomplete" ]; then
    echo "help"
    echo "version"
fi
149 getpkg/test.sh
@@ -68,6 +68,28 @@ cleanup() {
|
||||
# Clean up noarch variant
|
||||
$GETPKG unpublish "${TEST_TOOL_NAME}-noarch:universal" 2>/dev/null || true
|
||||
|
||||
# Clean up any remaining test packages that start with "test-"
|
||||
echo "Cleaning up any remaining test packages..."
|
||||
DIR_RESPONSE=$(curl -s "https://getpkg.xyz/dir" 2>/dev/null || echo "")
|
||||
if [ -n "$DIR_RESPONSE" ]; then
|
||||
# Extract test package labeltags from JSON response
|
||||
if command -v jq >/dev/null 2>&1; then
|
||||
TEST_PACKAGES=$(echo "$DIR_RESPONSE" | jq -r '.entries[]?.labeltags[]? // empty' 2>/dev/null | grep "^test-" | sort -u || echo "")
|
||||
else
|
||||
# Fallback: extract labeltags using grep and sed
|
||||
TEST_PACKAGES=$(echo "$DIR_RESPONSE" | grep -o '"test-[^"]*"' | sed 's/"//g' | sort -u || echo "")
|
||||
fi
|
||||
|
||||
if [ -n "$TEST_PACKAGES" ]; then
|
||||
echo "$TEST_PACKAGES" | while read -r package; do
|
||||
if [ -n "$package" ]; then
|
||||
echo " Cleaning up orphaned test package: $package"
|
||||
$GETPKG unpublish "$package" 2>/dev/null || true
|
||||
fi
|
||||
done
|
||||
fi
|
||||
fi
|
||||
|
||||
echo "Cleaned up test tools from getpkg.xyz"
|
||||
else
|
||||
echo "Note: SOS_WRITE_TOKEN not set, cannot clean up remote test objects"
|
||||
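The cleanup hunk above has two extraction paths, jq and a grep/sed fallback. Under the assumption that `/dir` returns JSON shaped like `{"entries":[{"labeltags":[...]}]}` (inferred from the jq filter, not from any API documentation), both paths reduce to the same label list:

```bash
#!/bin/bash
# Sample response in the shape the cleanup code assumes (an assumption, not the real API schema)
DIR_RESPONSE='{"entries":[{"labeltags":["test-abc:x86_64","prod-tool:universal"]},{"labeltags":["test-xyz:universal"]}]}'

if command -v jq >/dev/null 2>&1; then
    # Preferred path: walk entries[].labeltags[] and keep only test-* labels
    TEST_PACKAGES=$(echo "$DIR_RESPONSE" | jq -r '.entries[]?.labeltags[]? // empty' | grep "^test-" | sort -u)
else
    # Fallback: pull quoted test-* strings straight out of the raw JSON text
    TEST_PACKAGES=$(echo "$DIR_RESPONSE" | grep -o '"test-[^"]*"' | sed 's/"//g' | sort -u)
fi
echo "$TEST_PACKAGES"
```

Either path yields `test-abc:x86_64` and `test-xyz:universal` for this sample, which is what makes the fallback a safe substitute when jq is absent.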
@@ -455,12 +477,13 @@ EOF
    CONFIG_EXISTS=false
    TOOL_DIR_EXISTS=false
    SYMLINK_EXISTS=false
    HELPER_SYMLINK_EXISTS=false
    # HELPER_SYMLINK_EXISTS=false

    [ -f ~/.config/getpkg/"${TEST_UNINSTALL_TOOL}.json" ] && CONFIG_EXISTS=true
    [ -d ~/.getpkg/"$TEST_UNINSTALL_TOOL" ] && TOOL_DIR_EXISTS=true
    [ -L ~/.local/bin/getpkg/"$TEST_UNINSTALL_TOOL" ] && SYMLINK_EXISTS=true
    [ -L ~/.local/bin/getpkg/"${TEST_UNINSTALL_TOOL}-helper" ] && HELPER_SYMLINK_EXISTS=true
    # Check if helper symlink exists (not currently used in validation)
    # [ -L ~/.local/bin/getpkg/"${TEST_UNINSTALL_TOOL}-helper" ] && HELPER_SYMLINK_EXISTS=true

    if $CONFIG_EXISTS && $TOOL_DIR_EXISTS && $SYMLINK_EXISTS; then
        # Now uninstall
@@ -528,6 +551,128 @@ EOF
    fi
fi


# Test 13.5: Comprehensive unpublish functionality
echo -e "\nTest 13.5: Comprehensive unpublish functionality"

# Only run unpublish tests if SOS_WRITE_TOKEN is available
if [ -n "${SOS_WRITE_TOKEN:-}" ]; then
    # Create unique test names for unpublish tests
    UNPUBLISH_TOOL_BASE="test-unpublish-$RANDOM"
    UNPUBLISH_TOOL_MULTI="${UNPUBLISH_TOOL_BASE}-multi"
    UNPUBLISH_TOOL_CUSTOM="${UNPUBLISH_TOOL_BASE}-custom"
    UNPUBLISH_TEST_DIR="${TEST_DIR}/unpublish_tests"

    # Create test directory structure
    mkdir -p "$UNPUBLISH_TEST_DIR"

    # Test 13.5a: Create and publish tool with multiple architectures
    echo "Test 13.5a: Unpublish tool with multiple architectures"
    echo '#!/bin/bash
echo "Multi-arch unpublish test"' > "$UNPUBLISH_TEST_DIR/$UNPUBLISH_TOOL_MULTI"
    chmod +x "$UNPUBLISH_TEST_DIR/$UNPUBLISH_TOOL_MULTI"

    # Publish to multiple architectures
    PUBLISH_x86_64_OUTPUT=$("$GETPKG" publish "${UNPUBLISH_TOOL_MULTI}:x86_64" "$UNPUBLISH_TEST_DIR" 2>&1)
    PUBLISH_aarch64_OUTPUT=$("$GETPKG" publish "${UNPUBLISH_TOOL_MULTI}:aarch64" "$UNPUBLISH_TEST_DIR" 2>&1)
    PUBLISH_universal_OUTPUT=$("$GETPKG" publish "${UNPUBLISH_TOOL_MULTI}:universal" "$UNPUBLISH_TEST_DIR" 2>&1)

    if [[ "$PUBLISH_x86_64_OUTPUT" =~ Published! ]] && [[ "$PUBLISH_aarch64_OUTPUT" =~ Published! ]] && [[ "$PUBLISH_universal_OUTPUT" =~ Published! ]]; then
        # Test robust unpublish - should remove ALL architectures
        sleep 1  # Give server time to process all publishes
        UNPUBLISH_OUTPUT=$("$GETPKG" unpublish "$UNPUBLISH_TOOL_MULTI" 2>&1)
        UNPUBLISH_EXIT_CODE=$?

        # Check that unpublish found and removed packages
        if [ $UNPUBLISH_EXIT_CODE -eq 0 ] && [[ "$UNPUBLISH_OUTPUT" =~ "Found" ]] && [[ "$UNPUBLISH_OUTPUT" =~ "Successfully unpublished" ]]; then
            print_test_result "Unpublish removes all architectures" 0
        else
            print_test_result "Unpublish removes all architectures" 1
            echo "  Unpublish failed: $UNPUBLISH_OUTPUT"
        fi
    else
        print_test_result "Unpublish removes all architectures" 1
        echo "  Failed to publish test tool to multiple architectures"
        echo "  x86_64: $PUBLISH_x86_64_OUTPUT"
        echo "  aarch64: $PUBLISH_aarch64_OUTPUT"
        echo "  universal: $PUBLISH_universal_OUTPUT"
    fi

    # Test 13.5b: Unpublish tool with universal architecture
    echo "Test 13.5b: Unpublish tool with universal architecture"
    echo '#!/bin/bash
echo "Universal arch unpublish test"' > "$UNPUBLISH_TEST_DIR/$UNPUBLISH_TOOL_CUSTOM"
    chmod +x "$UNPUBLISH_TEST_DIR/$UNPUBLISH_TOOL_CUSTOM"

    # Publish with universal architecture
    PUBLISH_CUSTOM_OUTPUT=$("$GETPKG" publish "${UNPUBLISH_TOOL_CUSTOM}:universal" "$UNPUBLISH_TEST_DIR" 2>&1)

    if [[ "$PUBLISH_CUSTOM_OUTPUT" =~ Published! ]]; then
        # Test that unpublish can find and remove custom tags
        UNPUBLISH_CUSTOM_OUTPUT=$("$GETPKG" unpublish "$UNPUBLISH_TOOL_CUSTOM" 2>&1)
        UNPUBLISH_CUSTOM_EXIT_CODE=$?

        if [ $UNPUBLISH_CUSTOM_EXIT_CODE -eq 0 ] && [[ "$UNPUBLISH_CUSTOM_OUTPUT" =~ Found\ ${UNPUBLISH_TOOL_CUSTOM}:universal ]]; then
            print_test_result "Unpublish finds universal architecture" 0
        else
            print_test_result "Unpublish finds universal architecture" 1
            echo "  Failed to find or unpublish custom tag: $UNPUBLISH_CUSTOM_OUTPUT"
        fi
    else
        print_test_result "Unpublish finds universal architecture" 1
        echo "  Failed to publish tool with custom tag: $PUBLISH_CUSTOM_OUTPUT"
    fi

    # Test 13.5c: Unpublish non-existent tool
    echo "Test 13.5c: Unpublish non-existent tool"
    NON_EXISTENT_TOOL="non-existent-tool-$RANDOM"
    UNPUBLISH_MISSING_OUTPUT=$("$GETPKG" unpublish "$NON_EXISTENT_TOOL" 2>&1)
    UNPUBLISH_MISSING_EXIT_CODE=$?

    if [ $UNPUBLISH_MISSING_EXIT_CODE -ne 0 ] && [[ "$UNPUBLISH_MISSING_OUTPUT" =~ "No packages found" ]]; then
        print_test_result "Unpublish handles missing tools gracefully" 0
    else
        print_test_result "Unpublish handles missing tools gracefully" 1
        echo "  Expected failure for non-existent tool, got: $UNPUBLISH_MISSING_OUTPUT"
    fi

    # Test 13.5d: Unpublish by hash
    echo "Test 13.5d: Unpublish by hash"
    UNPUBLISH_TOOL_HASH="${UNPUBLISH_TOOL_BASE}-hash"
    echo '#!/bin/bash
echo "Hash unpublish test"' > "$UNPUBLISH_TEST_DIR/$UNPUBLISH_TOOL_HASH"
    chmod +x "$UNPUBLISH_TEST_DIR/$UNPUBLISH_TOOL_HASH"

    PUBLISH_HASH_OUTPUT=$("$GETPKG" publish "${UNPUBLISH_TOOL_HASH}:x86_64" "$UNPUBLISH_TEST_DIR" 2>&1)

    if [[ "$PUBLISH_HASH_OUTPUT" =~ Hash:\ ([0-9]+) ]]; then
        EXTRACTED_HASH="${BASH_REMATCH[1]}"

        # Test unpublish by hash
        UNPUBLISH_HASH_OUTPUT=$("$GETPKG" unpublish "$EXTRACTED_HASH" 2>&1)
        UNPUBLISH_HASH_EXIT_CODE=$?

        if [ $UNPUBLISH_HASH_EXIT_CODE -eq 0 ] && [[ "$UNPUBLISH_HASH_OUTPUT" =~ "Successfully unpublished hash" ]]; then
            print_test_result "Unpublish by hash works" 0
        else
            print_test_result "Unpublish by hash works" 1
            echo "  Failed to unpublish by hash: $UNPUBLISH_HASH_OUTPUT"
        fi
    else
        print_test_result "Unpublish by hash works" 1
        echo "  Could not extract hash from publish output"
    fi

    # Cleanup unpublish test directory
    rm -rf "$UNPUBLISH_TEST_DIR"

else
    echo "  Skipping unpublish tests (SOS_WRITE_TOKEN not set)"
    print_test_result "Unpublish removes all architectures" 0  # Pass as skipped
    print_test_result "Unpublish finds universal architecture" 0
    print_test_result "Unpublish handles missing tools gracefully" 0
    print_test_result "Unpublish by hash works" 0
fi

# Test 14: Invalid tool name validation
echo -e "\nTest 14: Invalid tool name validation"
INVALID_OUTPUT=$(timeout 3 "$GETPKG" install "../evil-tool" 2>&1)
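Test 13.5d above relies on bash's `=~` operator populating `BASH_REMATCH` with capture groups. A minimal sketch of that extraction pattern, using an illustrative output line (the real format is whatever getpkg prints, which this sketch does not claim to reproduce):

```bash
#!/bin/bash
# Illustrative publish output; only the "Hash: <digits>" portion matters to the regex
PUBLISH_HASH_OUTPUT="Published! URL: https://getpkg.xyz/example  Hash: 1234567890"

# Note the regex must be unquoted for =~ to treat it as a pattern;
# the escaped space matches a literal space before the digit group.
if [[ "$PUBLISH_HASH_OUTPUT" =~ Hash:\ ([0-9]+) ]]; then
    EXTRACTED_HASH="${BASH_REMATCH[1]}"   # first capture group
    echo "$EXTRACTED_HASH"
fi
```

`BASH_REMATCH[0]` would hold the whole match (`Hash: 1234567890`); index 1 holds just the digits, which is what the test feeds back into `getpkg unpublish`.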
@@ -1 +0,0 @@
#!/bin/bash\necho debug
@@ -1 +0,0 @@
#!/bin/bash\necho debug2
@@ -1 +0,0 @@
test content
72 gp/gp
@@ -225,19 +225,77 @@ show_status_and_confirm() {

    # Show staged changes
    if ! git diff --cached --quiet; then
        print_info "Staged changes:"
        git diff --cached --name-only -- | while IFS= read -r line; do echo "  $line"; done
        local staged_modified=""
        local staged_deleted=""
        local staged_added=""

        # Get staged file status and categorize
        while IFS=$'\t' read -r status file; do
            [ -z "$status" ] && continue
            case "${status:0:1}" in
                A) staged_added="${staged_added}${file}\n" ;;
                M) staged_modified="${staged_modified}${file}\n" ;;
                D) staged_deleted="${staged_deleted}${file}\n" ;;
                *) staged_modified="${staged_modified}${file}\n" ;;  # Default to modified for other statuses
            esac
        done < <(git diff --cached --name-status)

        # Show staged added files
        if [ -n "$staged_added" ]; then
            print_info "Staged new files:"
            echo -e "$staged_added" | grep -v '^$' | while IFS= read -r line; do echo "  $line"; done
        fi

        # Show staged modified files
        if [ -n "$staged_modified" ]; then
            print_info "Staged modified files:"
            echo -e "$staged_modified" | grep -v '^$' | while IFS= read -r line; do echo "  $line"; done
        fi

        # Show staged deleted files
        if [ -n "$staged_deleted" ]; then
            print_info "Staged deleted files:"
            echo -e "$staged_deleted" | grep -v '^$' | while IFS= read -r line; do echo "  $line"; done
        fi

        has_staged_changes=true
    fi

    # Show unstaged changes
    if ! git diff --quiet; then
        if [ "$ADD_ALL" = true ]; then
            print_info "Modified files (will be added):"
        else
            print_info "Modified files (unstaged, will NOT be included):"
        local modified_files=""
        local deleted_files=""

        # Get file status and categorize
        while IFS=$'\t' read -r status file; do
            [ -z "$status" ] && continue
            case "${status:0:1}" in
                M) modified_files="${modified_files}${file}\n" ;;
                D) deleted_files="${deleted_files}${file}\n" ;;
                *) modified_files="${modified_files}${file}\n" ;;  # Default to modified for other statuses
            esac
        done < <(git diff --name-status)

        # Show modified files
        if [ -n "$modified_files" ]; then
            if [ "$ADD_ALL" = true ]; then
                print_info "Modified files (will be added):"
            else
                print_info "Modified files (unstaged, will NOT be included):"
            fi
            echo -e "$modified_files" | grep -v '^$' | while IFS= read -r line; do echo "  $line"; done
        fi
        git diff --name-only -- | while IFS= read -r line; do echo "  $line"; done

        # Show deleted files
        if [ -n "$deleted_files" ]; then
            if [ "$ADD_ALL" = true ]; then
                print_info "Deleted files (will be removed):"
            else
                print_info "Deleted files (unstaged, will NOT be included):"
            fi
            echo -e "$deleted_files" | grep -v '^$' | while IFS= read -r line; do echo "  $line"; done
        fi

        has_unstaged_changes=true
    fi

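The categorization loop added to gp/gp keys off the first letter of each `git diff --name-status` line. Fed a synthetic status listing (standing in for a real repository), the loop behaves like this:

```bash
#!/bin/bash
# Synthetic `git diff --name-status` output, tab-separated as git emits it
status_lines=$'M\tsrc/main.c\nD\told/notes.txt\nA\tdocs/new.md'

modified_files=""
deleted_files=""
while IFS=$'\t' read -r status file; do
    [ -z "$status" ] && continue
    case "${status:0:1}" in
        M) modified_files="${modified_files}${file}\n" ;;
        D) deleted_files="${deleted_files}${file}\n" ;;
        *) modified_files="${modified_files}${file}\n" ;;  # adds, renames, etc. fall back to "modified"
    esac
done <<< "$status_lines"

# The accumulated strings hold literal \n sequences, which is why
# the script prints them with `echo -e`
echo -e "modified:\n${modified_files}"
echo -e "deleted:\n${deleted_files}"
```

Note the `A` (added) line lands in the modified bucket via the `*` default; only the staged-changes loop in the hunk above gives additions their own category.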
20 sos/clean.sh (new executable file)
@@ -0,0 +1,20 @@
#!/bin/bash

set -euo pipefail

SCRIPT_DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )"
PROJECT="sos"

echo "Cleaning ${PROJECT}..."

# Remove output directory (if it exists)
if [ -d "${SCRIPT_DIR}/output" ]; then
    echo "Removing output directory..."
    rm -rf "${SCRIPT_DIR}/output"
fi

# Remove any temporary files
echo "Removing temporary files..."
find "${SCRIPT_DIR}" -name "*.tmp" -o -name "*.temp" -o -name "*~" | xargs -r rm -f

echo "✓ ${PROJECT} cleaned successfully"
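A side note on the `find ... | xargs` pipeline in clean.sh: GNU find applies the implicit `-print` to the whole `-o` expression, so all three patterns are matched, but filenames containing spaces or newlines would still be split by plain `xargs`. A NUL-delimited variant of the same cleanup (a sketch against a throwaway directory, not a change to the file above) avoids that:

```bash
#!/bin/bash
set -euo pipefail

# Throwaway directory with one file per pattern, including a name with a space
SCRIPT_DIR=$(mktemp -d)
touch "${SCRIPT_DIR}/a.tmp" "${SCRIPT_DIR}/b with space.temp" "${SCRIPT_DIR}/keep.txt"

# Group the -name tests so -print0 applies to all of them,
# and NUL-delimit so odd filenames survive the pipe
find "${SCRIPT_DIR}" \( -name "*.tmp" -o -name "*.temp" -o -name "*~" \) -print0 | xargs -0 -r rm -f

ls "${SCRIPT_DIR}"
```

The explicit `\( ... \)` grouping matters once `-print0` is added; without it, `-print0` would bind only to the last `-name` test.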
@@ -25,6 +25,7 @@ GETPKG="${SCRIPT_DIR}/../getpkg/output/getpkg"
TOOLDIR="${SCRIPT_DIR}/tool"
mkdir -p "${TOOLDIR}"
cp "${SCRIPT_DIR}/whatsdirty" "${TOOLDIR}/whatsdirty"
cp "${SCRIPT_DIR}/setup_script.sh" "${TOOLDIR}/"

# publish universal tool.
"${GETPKG}" publish "whatsdirty" "${TOOLDIR}"