Add CI/CD pipeline, logging enhancements, and release management

- Create Gitea Actions workflows for testing with Python 3.12 and 3.13.
- Update Makefile to include release management commands and pipeline checks.
- Document the CI/CD pipeline structure and usage in PIPELINE.md.
- Add structlog for structured logging and enhance logging utilities.
- Implement release management script for automated versioning and tagging.
- Modify logging configuration to support structured logging and improved formatting.
- Update dependencies in pyproject.toml and poetry.lock to include structlog.
- Enhance access logging in server and middleware to include structured data.
Ilya Glazunov 2025-09-03 00:13:21 +03:00
parent ff093b020f
commit 537b783726
16 changed files with 1054 additions and 241 deletions
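The structured access logging described in the commit message (client, request, and status carried as separate fields rather than a pre-formatted string) can be illustrated with a small stdlib-only sketch. This is an illustration of the idea only, not PyServeX's actual implementation, which uses structlog:

```python
import json
import logging
import time


class JsonFormatter(logging.Formatter):
    """Render log records as JSON, similar in spirit to structlog's JSONRenderer."""

    def format(self, record: logging.LogRecord) -> str:
        entry = {
            "timestamp": time.strftime("%Y-%m-%d %H:%M:%S", time.localtime(record.created)),
            "level": record.levelname,
            "logger": record.name,
            "event": record.getMessage(),
        }
        # Structured extras (e.g. client/request/status on access records) ride along
        # as their own JSON keys instead of being baked into the message text.
        for key in ("client", "request", "status"):
            if hasattr(record, key):
                entry[key] = getattr(record, key)
        return json.dumps(entry, ensure_ascii=False)


logger = logging.getLogger("pyserve.access")
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("GET /health", extra={"client": "127.0.0.1", "status": "200"})
```

The point of the change is that downstream consumers can filter on `status` or `client` directly instead of parsing a formatted line.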


@@ -1,4 +1,4 @@
 [flake8]
-max-line-length = 120
+max-line-length = 150
 exclude = __pycache__,.git,.venv,venv,build,dist
 ignore = E203,W503


@@ -0,0 +1,66 @@
# PyServeX v{VERSION}
## What's new in this version
### New Features
- [ ] Add description of new features
- [ ] List new commands or options
- [ ] Mention performance improvements
### Bug Fixes
- [ ] Describe fixed bugs
- [ ] Mention resolved security issues
- [ ] List compatibility fixes
### Technical Changes
- [ ] Dependency updates
- [ ] Code refactoring
- [ ] Architecture improvements
### Documentation
- [ ] README updates
- [ ] New usage examples
- [ ] API changes
## Installation
```bash
pip install pyserve=={VERSION}
```
## Usage
```bash
# Basic usage
pyserve
# With custom configuration
pyserve --config config.yaml
# In debug mode
pyserve --debug
```
## Migration from previous version
If you're upgrading from version v{PREVIOUS_VERSION}:
1. [ ] Describe necessary configuration changes
2. [ ] Mention deprecated functions
3. [ ] Provide migration examples
## Known Issues
- [ ] List known limitations
- [ ] Provide workarounds for issues
- [ ] Link to relevant issues
## Acknowledgments
Thanks to all contributors who made this version possible!
---
**Full changelog:** https://git.pyserve.org/Shifty/pyserveX/compare/v{PREVIOUS_VERSION}...v{VERSION}
**Documentation:** https://git.pyserve.org/Shifty/pyserveX/wiki
**Report a bug:** https://git.pyserve.org/Shifty/pyserveX/issues/new


@@ -0,0 +1,82 @@
# Automated Release Configuration
## How to use the pipeline
### 1. Linting (executed on every push)
```bash
# Triggers:
- push to any branch
- pull request to any branch
# Checks:
- Black (code formatting)
- isort (import sorting)
- flake8 (linting)
- mypy (type checking)
```
### 2. Tests (executed for dev, master, main)
```bash
# Triggers:
- push to branches: dev, master, main
- pull request to branches: dev, master, main
# Checks:
- pytest on Python 3.12 and 3.13
- coverage reports
```
### 3. Build and release (executed for tags)
```bash
# Triggers:
- push tag matching v*.*.*
- manual dispatch through Gitea interface
# Actions:
- Package build via Poetry
- Draft release creation
- Artifact upload (.whl and .tar.gz)
```
## Release workflow
1. **Release preparation:**
```bash
# Update version in pyproject.toml
poetry version patch # or minor/major
# Commit changes
git add pyproject.toml
git commit -m "bump version to $(poetry version -s)"
```
2. **Tag creation:**
```bash
# Create tag
git tag v$(poetry version -s)
git push origin v$(poetry version -s)
```
3. **Automatic process:**
- Pipeline starts automatically
- Linting and tests execute
- Package builds
- Draft release created
4. **Release finalization:**
- Go to Gitea interface
- Find created draft release
- Edit description according to template
- Publish release
## Environment variables
For correct pipeline operation, ensure:
- `GITHUB_TOKEN` - for release creation
- Repository permissions for release creation
## Customization
- Change Python versions in `test.yaml` if needed
- Add additional checks in `lint.yaml`
- Configure notifications in `pipeline.yaml`


@@ -0,0 +1,52 @@
name: Lint Code
run-name: ${{ gitea.actor }} started code linting
on:
  push:
    branches: ["*"]
  pull_request:
    branches: ["*"]
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.12'
      - name: Install Poetry
        uses: snok/install-poetry@v1
        with:
          version: latest
          virtualenvs-create: true
          virtualenvs-in-project: true
      - name: Load cached venv
        id: cached-poetry-dependencies
        uses: actions/cache@v3
        with:
          path: .venv
          key: venv-${{ runner.os }}-${{ hashFiles('**/poetry.lock') }}
      - name: Install dependencies
        if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
        run: poetry install --with dev
      - name: Run Black (Code formatting check)
        run: poetry run black --check pyserve/
      - name: Run isort (Import sorting check)
        run: poetry run isort --check-only pyserve/
      - name: Run flake8 (Linting)
        run: poetry run flake8 pyserve/
      - name: Run mypy (Type checking)
        run: poetry run mypy pyserve/
      - name: Lint completed
        run: echo "Code passed all linting checks!"


@@ -0,0 +1,49 @@
name: CI/CD Pipeline
run-name: ${{ gitea.actor }} started full pipeline
on:
  push:
    branches: ["*"]
    tags: ["v*"]
  pull_request:
    branches: ["dev", "master", "main"]
  workflow_dispatch:
    inputs:
      version:
        description: 'Release version (e.g., v0.6.1)'
        required: false
        default: ''
jobs:
  lint:
    uses: ./.gitea/workflows/lint.yaml
  test:
    if: github.ref == 'refs/heads/dev' || github.ref == 'refs/heads/master' || github.ref == 'refs/heads/main' || github.event_name == 'pull_request'
    needs: lint
    uses: ./.gitea/workflows/test.yaml
  build-and-release:
    if: startsWith(github.ref, 'refs/tags/v') || github.event_name == 'workflow_dispatch'
    needs: [lint, test]
    uses: ./.gitea/workflows/release.yaml
    with:
      version: ${{ github.event.inputs.version }}
  notify:
    runs-on: ubuntu-latest
    needs: [lint, test, build-and-release]
    if: always()
    steps:
      - name: Pipeline Summary
        run: |
          echo "## Pipeline Execution Results" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Stage | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|-------|--------|" >> $GITHUB_STEP_SUMMARY
          echo "| Linting | ${{ needs.lint.result == 'success' && 'Success' || 'Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Tests | ${{ needs.test.result == 'success' && 'Success' || needs.test.result == 'skipped' && 'Skipped' || 'Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Build and Release | ${{ needs.build-and-release.result == 'success' && 'Success' || needs.build-and-release.result == 'skipped' && 'Skipped' || 'Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          if [[ "${{ needs.build-and-release.result }}" == "success" ]]; then
            echo "**Draft release created!** Check and publish in Gitea interface." >> $GITHUB_STEP_SUMMARY
          fi


@@ -1,19 +0,0 @@
name: Gitea Actions Demo
run-name: ${{ gitea.actor }} is testing out Gitea Actions 🚀
on: [push]
jobs:
  Explore-Gitea-Actions:
    runs-on: ubuntu-latest
    steps:
      - run: echo "🎉 The job was automatically triggered by a ${{ gitea.event_name }} event."
      - run: echo "🐧 This job is now running on a ${{ runner.os }} server hosted by Gitea!"
      - run: echo "🔎 The name of your branch is ${{ gitea.ref }} and your repository is ${{ gitea.repository }}."
      - name: Check out repository code
        uses: actions/checkout@v4
      - run: echo "💡 The ${{ gitea.repository }} repository has been cloned to the runner."
      - run: echo "🖥️ The workflow is now ready to test your code on the runner."
      - name: List files in the repository
        run: |
          ls ${{ gitea.workspace }}
      - run: echo "🍏 This job's status is ${{ job.status }}."


@@ -0,0 +1,155 @@
name: Build and Release
run-name: ${{ gitea.actor }} preparing release
on:
  push:
    tags:
      - 'v*'
  workflow_dispatch:
    inputs:
      version:
        description: 'Release version (e.g., v0.6.1)'
        required: true
        default: 'v0.6.1'
jobs:
  build:
    runs-on: ubuntu-latest
    needs: []
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0
      - name: Setup Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.12'
      - name: Install Poetry
        uses: snok/install-poetry@v1
        with:
          version: latest
          virtualenvs-create: true
          virtualenvs-in-project: true
      - name: Load cached venv
        id: cached-poetry-dependencies
        uses: actions/cache@v3
        with:
          path: .venv
          key: venv-${{ runner.os }}-${{ hashFiles('**/poetry.lock') }}
      - name: Install dependencies
        if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
        run: poetry install --with dev
      - name: Build package
        run: |
          poetry build
          echo "Package built successfully!"
          ls -la dist/
      - name: Generate changelog
        id: changelog
        run: |
          echo "## What's new in this version" > CHANGELOG.md
          echo "" >> CHANGELOG.md
          echo "### New Features" >> CHANGELOG.md
          echo "- Add description of new features" >> CHANGELOG.md
          echo "" >> CHANGELOG.md
          echo "### Bug Fixes" >> CHANGELOG.md
          echo "- Add description of bug fixes" >> CHANGELOG.md
          echo "" >> CHANGELOG.md
          echo "### Technical Changes" >> CHANGELOG.md
          echo "- Add description of technical improvements" >> CHANGELOG.md
          echo "" >> CHANGELOG.md
          echo "### Dependencies" >> CHANGELOG.md
          echo "- Updated dependencies to latest versions" >> CHANGELOG.md
          echo "" >> CHANGELOG.md
          echo "---" >> CHANGELOG.md
          echo "**Full Changelog:** https://gitea.example.com/${{ gitea.repository }}/compare/v0.5.0...${{ github.ref_name }}" >> CHANGELOG.md
      - name: Upload build artifacts
        uses: actions/upload-artifact@v3
        with:
          name: dist-${{ github.ref_name }}
          path: |
            dist/
            CHANGELOG.md
      - name: Build completed
        run: echo "Build completed! Artifacts ready for release."
  release:
    runs-on: ubuntu-latest
    needs: build
    if: startsWith(github.ref, 'refs/tags/v') || github.event_name == 'workflow_dispatch'
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Download build artifacts
        uses: actions/download-artifact@v3
        with:
          name: dist-${{ github.ref_name || github.event.inputs.version }}
      - name: Read changelog
        id: changelog
        run: |
          if [ -f CHANGELOG.md ]; then
            echo "CHANGELOG<<EOF" >> $GITHUB_OUTPUT
            cat CHANGELOG.md >> $GITHUB_OUTPUT
            echo "EOF" >> $GITHUB_OUTPUT
          else
            echo "CHANGELOG=Automatically generated release" >> $GITHUB_OUTPUT
          fi
      - name: Create Release
        id: create_release
        uses: actions/create-release@v1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          tag_name: ${{ github.ref_name || github.event.inputs.version }}
          release_name: PyServeX ${{ github.ref_name || github.event.inputs.version }}
          body: |
            ${{ steps.changelog.outputs.CHANGELOG }}
            ## Installation
            ```bash
            pip install pyserve==${{ github.ref_name || github.event.inputs.version }}
            ```
            ## Usage
            ```bash
            pyserve --help
            ```
          draft: true
          prerelease: false
      - name: Upload Release Asset (wheel)
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ./dist/pyserve-*.whl
          asset_name: pyserve-${{ github.ref_name || github.event.inputs.version }}.whl
          asset_content_type: application/octet-stream
      - name: Upload Release Asset (tarball)
        uses: actions/upload-release-asset@v1
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        with:
          upload_url: ${{ steps.create_release.outputs.upload_url }}
          asset_path: ./dist/pyserve-*.tar.gz
          asset_name: pyserve-${{ github.ref_name || github.event.inputs.version }}.tar.gz
          asset_content_type: application/gzip
      - name: Release created
        run: echo "Draft release created! Check and publish in Gitea interface."
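The `Read changelog` step above relies on the `name<<DELIMITER` heredoc convention for writing multi-line values into `$GITHUB_OUTPUT`. A minimal sketch of that encoding in Python (an illustration of the convention, not part of the workflow):

```python
def encode_multiline_output(name: str, value: str, delimiter: str = "EOF") -> str:
    """Encode a multi-line value for $GITHUB_OUTPUT using the
    heredoc-style name<<DELIMITER ... DELIMITER convention."""
    # The delimiter must not occur inside the value, or the runner
    # would terminate the block early.
    if delimiter in value:
        raise ValueError("delimiter collides with value")
    return f"{name}<<{delimiter}\n{value}\n{delimiter}\n"


print(encode_multiline_output("CHANGELOG", "## What's new\n- item"))
```

Single-line values use the plain `name=value` form, which is why the `else` branch in the workflow writes `CHANGELOG=...` directly.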


@@ -0,0 +1,56 @@
name: Run Tests
run-name: ${{ gitea.actor }} started tests
on:
  push:
    branches: ["dev", "master", "main"]
  pull_request:
    branches: ["dev", "master", "main"]
jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ['3.12', '3.13']
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      - name: Setup Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install Poetry
        uses: snok/install-poetry@v1
        with:
          version: latest
          virtualenvs-create: true
          virtualenvs-in-project: true
      - name: Load cached venv
        id: cached-poetry-dependencies
        uses: actions/cache@v3
        with:
          path: .venv
          key: venv-${{ runner.os }}-${{ matrix.python-version }}-${{ hashFiles('**/poetry.lock') }}
      - name: Install dependencies
        if: steps.cached-poetry-dependencies.outputs.cache-hit != 'true'
        run: poetry install --with dev
      - name: Run tests
        run: poetry run pytest tests/ -v
      - name: Run tests with coverage
        run: poetry run pytest tests/ -v --cov=pyserve --cov-report=xml --cov-report=term
      - name: Upload coverage to artifacts
        uses: actions/upload-artifact@v3
        with:
          name: coverage-report-${{ matrix.python-version }}
          path: coverage.xml
      - name: Tests completed
        run: echo "All tests passed successfully on Python ${{ matrix.python-version }}!"


@@ -1,4 +1,4 @@
-.PHONY: help install build clean test lint format run dev-install dev-deps check
+.PHONY: help install build clean test lint format run dev-install dev-deps check release-patch release-minor release-major pipeline-check
 PYTHON = python3
 POETRY = poetry
@@ -55,6 +55,12 @@ help:
 	@printf " $(YELLOW)%-20s$(CYAN) %s$(NC)\n" "config-create" "Creating config.yaml"
 	@printf " $(YELLOW)%-20s$(CYAN) %s$(NC)\n" "watch-logs" "Last server logs"
 	@printf " $(YELLOW)%-20s$(CYAN) %s$(NC)\n" "init" "Project initialized for development"
+	@echo ""
+	@echo "$(YELLOW)Release Management:$(NC)"
+	@printf " $(YELLOW)%-20s$(CYAN) %s$(NC)\n" "release-patch" "Create patch release (x.x.X)"
+	@printf " $(YELLOW)%-20s$(CYAN) %s$(NC)\n" "release-minor" "Create minor release (x.X.0)"
+	@printf " $(YELLOW)%-20s$(CYAN) %s$(NC)\n" "release-major" "Create major release (X.0.0)"
+	@printf " $(YELLOW)%-20s$(CYAN) %s$(NC)\n" "pipeline-check" "Run all pipeline checks locally"
 	@echo "$(GREEN)╚══════════════════════════════════════════════════════════════════════════════╝$(NC)"
 install:
@@ -169,4 +175,26 @@ watch-logs:
 init: dev-install config-create
 	@echo "$(GREEN)Project initialized for development!$(NC)"
+release-patch:
+	@echo "$(GREEN)Creating patch release...$(NC)"
+	@./scripts/release.sh patch
+release-minor:
+	@echo "$(GREEN)Creating minor release...$(NC)"
+	@./scripts/release.sh minor
+release-major:
+	@echo "$(GREEN)Creating major release...$(NC)"
+	@./scripts/release.sh major
+pipeline-check:
+	@echo "$(GREEN)Checking pipeline locally...$(NC)"
+	@echo "$(YELLOW)Running lint checks...$(NC)"
+	@$(MAKE) lint
+	@echo "$(YELLOW)Running tests...$(NC)"
+	@$(MAKE) test
+	@echo "$(YELLOW)Building package...$(NC)"
+	@$(MAKE) build
+	@echo "$(GREEN)All pipeline checks passed!$(NC)"
 .DEFAULT_GOAL := help

PIPELINE.md (new file)

@@ -0,0 +1,178 @@
# CI/CD Pipeline for PyServeX
This document describes the complete CI/CD pipeline for the PyServeX project, including linting, testing, building, and automated release creation.
## Pipeline Structure
### 1. Linting Stage (`lint.yaml`)
**Triggers:**
- Push to any branch
- Pull request to any branch
**Checks:**
- Black (code formatting)
- isort (import sorting)
- flake8 (code analysis)
- mypy (type checking)
### 2. Testing Stage (`test.yaml`)
**Triggers:**
- Push to branches: `dev`, `master`, `main`
- Pull request to branches: `dev`, `master`, `main`
**Checks:**
- pytest on Python 3.12 and 3.13
- Coverage report generation
- Artifact storage with reports
### 3. Build and Release Stage (`release.yaml`)
**Triggers:**
- Push tag matching `v*.*.*`
- Manual trigger through Gitea interface
**Actions:**
- Package build via Poetry
- Changelog generation
- Draft release creation
- Artifact upload (.whl and .tar.gz)
### 4. Main Pipeline (`pipeline.yaml`)
Coordinates execution of all stages and provides results summary.
## How to Use
### Local Development
```bash
# Environment initialization
make init
# Check all stages locally
make pipeline-check
# Code formatting
make format
# Run tests
make test-cov
```
### Creating a Release
#### Automatic method (recommended):
```bash
# Patch release (x.x.X)
make release-patch
# Minor release (x.X.0)
make release-minor
# Major release (X.0.0)
make release-major
```
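The three bump targets map directly onto semantic version components. As an illustration (not Poetry's implementation), this is what `poetry version patch|minor|major` computes for a plain `X.Y.Z` version; pre-release and build suffixes are out of scope here:

```python
def bump(version: str, part: str) -> str:
    """Compute the next semantic version for a plain X.Y.Z string."""
    major, minor, patch = (int(p) for p in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"   # breaking changes reset minor and patch
    if part == "minor":
        return f"{major}.{minor + 1}.0"  # new features reset patch
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"  # bug fixes only
    raise ValueError(f"unknown bump target: {part}")


print(bump("0.6.1", "patch"))  # 0.6.2
print(bump("0.6.1", "minor"))  # 0.7.0
print(bump("0.6.1", "major"))  # 1.0.0
```

The release targets then prepend `v` to the result when creating the git tag, matching the `v*.*.*` trigger pattern.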
#### Manual method:
```bash
# 1. Update version
poetry version patch # or minor/major
# 2. Commit changes
git add pyproject.toml
git commit -m "bump version to $(poetry version -s)"
# 3. Create tag
git tag v$(poetry version -s)
# 4. Push to server
git push origin main
git push origin v$(poetry version -s)
```
## Working with Releases
### What happens automatically:
1. **Tag push** - creating a `v*.*.*` tag starts the pipeline
2. **Linting** - code quality checks run
3. **Tests** - functionality is verified
4. **Package build** - wheel and tarball are created
5. **Draft release** - created automatically in Gitea
### What needs manual action:
1. **Go to Gitea interface** in Releases section
2. **Find created draft** release
3. **Edit description** according to template in `RELEASE_TEMPLATE.md`
4. **Publish release** (remove "Draft" status)
## Configuration
### Pipeline files:
- `.gitea/workflows/lint.yaml` - Linting
- `.gitea/workflows/test.yaml` - Testing
- `.gitea/workflows/release.yaml` - Build and release
- `.gitea/workflows/pipeline.yaml` - Main pipeline
- `.gitea/RELEASE_TEMPLATE.md` - Release template
### Scripts:
- `scripts/release.sh` - Automated release creation
- `Makefile` - Development and release commands
## Environment Setup
### Gitea Actions variables:
- `GITHUB_TOKEN` - for release creation (usually configured automatically)
### Access permissions:
- Repository release creation permissions
- Tag push permissions
## Monitoring
### Stage statuses:
- **Success** - stage completed successfully
- **Failure** - stage failed with error
- **Skipped** - stage skipped (e.g., tests for non-listed branches)
### Artifacts:
- **Coverage reports** - test coverage reports
- **Build artifacts** - built packages (.whl, .tar.gz)
- **Changelog** - automatically generated changelog
## Troubleshooting
### Common issues:
1. **Linting fails:**
```bash
make format # Auto-formatting
make lint # Check issues
```
2. **Tests fail:**
```bash
make test # Local test run
```
3. **Build error:**
```bash
make clean # Clean temporary files
make build # Rebuild
```
4. **Tag already exists:**
```bash
git tag -d v1.0.0 # Delete locally
git push origin :refs/tags/v1.0.0 # Delete on server
```
## Additional Resources
- [Poetry documentation](https://python-poetry.org/docs/)
- [Gitea Actions documentation](https://docs.gitea.io/en-us/actions/)
- [Pytest documentation](https://docs.pytest.org/)
- [Black documentation](https://black.readthedocs.io/)
---
**Author:** Ilya Glazunov
**Project:** PyServeX
**Documentation version:** 1.0

poetry.lock (generated)

@@ -653,6 +653,18 @@ typing-extensions = {version = ">=4.10.0", markers = "python_version < \"3.13\""
 [package.extras]
 full = ["httpx (>=0.27.0,<0.29.0)", "itsdangerous", "jinja2", "python-multipart (>=0.0.18)", "pyyaml"]
+[[package]]
+name = "structlog"
+version = "25.4.0"
+description = "Structured Logging for Python"
+optional = false
+python-versions = ">=3.8"
+groups = ["main"]
+files = [
+    {file = "structlog-25.4.0-py3-none-any.whl", hash = "sha256:fe809ff5c27e557d14e613f45ca441aabda051d119ee5a0102aaba6ce40eed2c"},
+    {file = "structlog-25.4.0.tar.gz", hash = "sha256:186cd1b0a8ae762e29417095664adf1d6a31702160a46dacb7796ea82f7409e4"},
+]
 [[package]]
 name = "types-pyyaml"
 version = "6.0.12.20250822"
@@ -960,4 +972,4 @@ dev = ["black", "flake8", "isort", "mypy", "pytest", "pytest-cov"]
 [metadata]
 lock-version = "2.1"
 python-versions = ">=3.12"
-content-hash = "e145aef2574fcda0c0d45b8620988baf25f386a1b6ccf199c56210cbc0e3aa76"
+content-hash = "5eda39db8e3d119d03c8e6083d1f9cd14691669a7130fb17b1445a0dd7bb79e7"


@@ -13,6 +13,7 @@ dependencies = [
     "uvicorn[standard] (>=0.35.0,<0.36.0)",
     "pyyaml (>=6.0,<7.0)",
     "types-pyyaml (>=6.0.12.20250822,<7.0.0.0)",
+    "structlog (>=25.4.0,<26.0.0)",
 ]
 [project.scripts]
@@ -34,7 +35,7 @@ requires = ["poetry-core>=2.0.0,<3.0.0"]
 build-backend = "poetry.core.masonry.api"
 [tool.black]
-line-length = 120
+line-length = 150
 target-version = ['py312']
 include = '\.pyi?$'
 exclude = '''


@@ -181,6 +181,18 @@ class Config:
             )
             files_config.append(file_config)
+        if 'show_module' in console_format_data:
+            print(
+                "\033[33mWARNING: Parameter 'show_module' in console.format in development and may work incorrectly\033[0m"
+            )
+            console_config.format.show_module = console_format_data.get('show_module')
+        for i, file_data in enumerate(log_data.get('files', [])):
+            if 'format' in file_data and 'show_module' in file_data['format']:
+                print(
+                    f"\033[33mWARNING: Parameter 'show_module' in files[{i}].format in development and may work incorrectly\033[0m"
+                )
         if not files_config:
             default_file_format = LogFormatConfig(
                 type=global_format.type,
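The hunk above warns when `show_module` appears under `console.format` or under a `files[i].format` entry. A config shape that would trigger both warnings might look like the following sketch; only the `format.show_module` keys are taken from the diff, so treat the surrounding layout as an assumption about the config schema, not its documented form:

```yaml
logging:
  console:
    format:
      type: standard
      show_module: true    # in development, triggers the console warning
  files:
    - path: ./logs/pyserve.log
      format:
        type: json
        show_module: false # triggers the files[0] warning
```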


@@ -2,14 +2,15 @@ import logging
 import logging.handlers
 import sys
 import time
-import json
 from pathlib import Path
-from typing import Dict, Any, List
+from typing import Dict, Any, List, cast, Callable
+import structlog
+from structlog.types import FilteringBoundLogger, EventDict
 from . import __version__
-class LoggerFilter(logging.Filter):
+class StructlogFilter(logging.Filter):
     def __init__(self, logger_names: List[str]):
         super().__init__()
         self.logger_names = logger_names
@@ -22,11 +23,10 @@
         for logger_name in self.logger_names:
             if record.name == logger_name or record.name.startswith(logger_name + '.'):
                 return True
         return False
-class UvicornLogFilter(logging.Filter):
+class UvicornStructlogFilter(logging.Filter):
     def filter(self, record: logging.LogRecord) -> bool:
         if hasattr(record, 'name') and 'uvicorn.access' in record.name:
             if hasattr(record, 'getMessage'):
@@ -39,113 +39,75 @@ class UvicornLogFilter(logging.Filter):
                 if len(request_part) >= 2:
                     method_path = request_part[0]
                     status_part = request_part[1]
-                    record.msg = f"Access: {client_info} - {method_path} - {status_part}"
+                    record.client = client_info
+                    record.request = method_path
+                    record.status = status_part
         return True
-class PyServeFormatter(logging.Formatter):
-    COLORS = {
-        'DEBUG': '\033[36m',     # Cyan
-        'INFO': '\033[32m',      # Green
-        'WARNING': '\033[33m',   # Yellow
-        'ERROR': '\033[31m',     # Red
-        'CRITICAL': '\033[35m',  # Magenta
-        'RESET': '\033[0m'       # Reset
-    }
-    def __init__(self, use_colors: bool = True, show_module: bool = True,
-                 timestamp_format: str = "%Y-%m-%d %H:%M:%S", *args: Any, **kwargs: Any):
-        super().__init__(*args, **kwargs)
-        self.use_colors = use_colors and hasattr(sys.stderr, 'isatty') and sys.stderr.isatty()
-        self.show_module = show_module
-        self.timestamp_format = timestamp_format
-    def format(self, record: logging.LogRecord) -> str:
-        if self.use_colors:
-            levelname = record.levelname
-            if levelname in self.COLORS:
-                record.levelname = f"{self.COLORS[levelname]}{levelname}{self.COLORS['RESET']}"
-        if self.show_module and hasattr(record, 'name'):
-            name = record.name
-            if name.startswith('uvicorn'):
-                record.name = 'uvicorn'
-            elif name.startswith('pyserve'):
-                pass
-            elif name.startswith('starlette'):
-                record.name = 'starlette'
-        return super().format(record)
-class PyServeJSONFormatter(logging.Formatter):
-    def __init__(self, timestamp_format: str = "%Y-%m-%d %H:%M:%S", *args: Any, **kwargs: Any):
-        super().__init__(*args, **kwargs)
-        self.timestamp_format = timestamp_format
-    def format(self, record: logging.LogRecord) -> str:
-        log_entry = {
-            'timestamp': time.strftime(self.timestamp_format, time.localtime(record.created)),
-            'level': record.levelname,
-            'logger': record.name,
-            'message': record.getMessage(),
-            'module': record.module,
-            'function': record.funcName,
-            'line': record.lineno,
-            'thread': record.thread,
-            'thread_name': record.threadName,
-        }
-        if record.exc_info:
-            log_entry['exception'] = self.formatException(record.exc_info)
-        for key, value in record.__dict__.items():
-            if key not in ['name', 'msg', 'args', 'levelname', 'levelno', 'pathname',
-                           'filename', 'module', 'lineno', 'funcName', 'created',
-                           'msecs', 'relativeCreated', 'thread', 'threadName',
-                           'processName', 'process', 'getMessage', 'exc_info', 'exc_text', 'stack_info']:
-                log_entry[key] = value
-        return json.dumps(log_entry, ensure_ascii=False, default=str)
-class AccessLogHandler(logging.Handler):
-    def __init__(self, logger_name: str = 'pyserve.access'):
-        super().__init__()
-        self.access_logger = logging.getLogger(logger_name)
-    def emit(self, record: logging.LogRecord) -> None:
-        self.access_logger.handle(record)
+def add_timestamp(logger: FilteringBoundLogger, method_name: str, event_dict: EventDict) -> EventDict:
+    event_dict["timestamp"] = time.strftime("%Y-%m-%d %H:%M:%S", time.localtime())
+    return event_dict
+def add_log_level(logger: FilteringBoundLogger, method_name: str, event_dict: EventDict) -> EventDict:
+    event_dict["level"] = method_name.upper()
+    return event_dict
+def add_module_info(logger: FilteringBoundLogger, method_name: str, event_dict: EventDict) -> EventDict:
+    if hasattr(logger, '_context') and 'logger_name' in logger._context:
+        logger_name = logger._context['logger_name']
+        if logger_name.startswith('pyserve'):
+            event_dict["module"] = logger_name
+        elif logger_name.startswith('uvicorn'):
+            event_dict["module"] = 'uvicorn'
+        elif logger_name.startswith('starlette'):
+            event_dict["module"] = 'starlette'
+        else:
+            event_dict["module"] = logger_name
+    return event_dict
+def filter_module_info(show_module: bool) -> Callable[[FilteringBoundLogger, str, EventDict], EventDict]:
+    def processor(logger: FilteringBoundLogger, method_name: str, event_dict: EventDict) -> EventDict:
+        if not show_module and "module" in event_dict:
+            del event_dict["module"]
+        return event_dict
+    return processor
+def colored_console_renderer(use_colors: bool = True, show_module: bool = True) -> structlog.dev.ConsoleRenderer:
+    return structlog.dev.ConsoleRenderer(
+        colors=use_colors and hasattr(sys.stderr, 'isatty') and sys.stderr.isatty(),
+        level_styles={
+            "critical": "\033[35m",  # Magenta
+            "error": "\033[31m",     # Red
+            "warning": "\033[33m",   # Yellow
+            "info": "\033[32m",      # Green
+            "debug": "\033[36m",     # Cyan
+        },
+        pad_event=25,
+    )
+def plain_console_renderer(show_module: bool = True) -> structlog.dev.ConsoleRenderer:
+    return structlog.dev.ConsoleRenderer(
+        colors=False,
+        pad_event=25,
+    )
+def json_renderer() -> structlog.processors.JSONRenderer:
+    return structlog.processors.JSONRenderer(ensure_ascii=False, sort_keys=True)
 class PyServeLogManager:
     def __init__(self) -> None:
         self.configured = False
         self.handlers: Dict[str, logging.Handler] = {}
-        self.loggers: Dict[str, logging.Logger] = {}
         self.original_handlers: Dict[str, List[logging.Handler]] = {}
+        self._structlog_configured = False
-    def _create_formatter(self, format_config: Dict[str, Any]) -> logging.Formatter:
-        format_type = format_config.get('type', 'standard').lower()
-        use_colors = format_config.get('use_colors', True)
-        show_module = format_config.get('show_module', True)
-        timestamp_format = format_config.get('timestamp_format', '%Y-%m-%d %H:%M:%S')
-        if format_type == 'json':
-            return PyServeJSONFormatter(timestamp_format=timestamp_format)
-        else:
-            if format_type == 'json':
-                fmt = None
-            else:
-                fmt = '%(asctime)s - %(name)s - %(levelname)s - [%(filename)s:%(lineno)d] - %(message)s'
-            return PyServeFormatter(
-                use_colors=use_colors,
-                show_module=show_module,
-                timestamp_format=timestamp_format,
-                fmt=fmt
-            )
     def setup_logging(self, config: Dict[str, Any]) -> None:
         if self.configured:
@@ -192,25 +154,85 @@
         self._save_original_handlers()
         self._clear_all_handlers()
-        root_logger = logging.getLogger()
-        root_logger.setLevel(logging.DEBUG)
+        self._configure_structlog(
+            main_level=main_level,
+            console_output=console_output,
+            console_format=console_format,
+            console_level=console_level,
+            files_config=files_config
+        )
+        self._configure_stdlib_loggers(main_level)
+        logger = self.get_logger('pyserve')
+        logger.info(
+            "PyServe logger initialized",
+            version=__version__,
+            level=main_level,
+            console_output=console_output,
+            console_format=console_format.get('type', 'standard')
+        )
+        for i, file_config in enumerate(files_config):
+            logger.info(
+                "File logging configured",
+                file_index=i,
+                path=file_config.get('path'),
+                level=file_config.get('level', main_level),
+                format_type=file_config.get('format', {}).get('type', 'standard')
+            )
+        self.configured = True
+    def _configure_structlog(
+        self,
+        main_level: str,
+        console_output: bool,
+        console_format: Dict[str, Any],
+        console_level: str,
+        files_config: List[Dict[str, Any]]
+    ) -> None:
+        shared_processors = [
+            structlog.stdlib.filter_by_level,
+            add_timestamp,
+            add_log_level,
+            add_module_info,
+            structlog.processors.StackInfoRenderer(),
+            structlog.processors.format_exc_info,
+        ]
         if console_output:
-            console_handler = logging.StreamHandler(sys.stdout)
-            console_handler.setLevel(getattr(logging, console_level))
+            console_show_module = console_format.get('show_module', True)
+            console_processors = shared_processors.copy()
+            console_processors.append(filter_module_info(console_show_module))
             if console_format.get('type') == 'json':
-                console_formatter = self._create_formatter(console_format)
+                console_processors.append(json_renderer())
             else:
-                console_formatter = PyServeFormatter(
-                    use_colors=console_format.get('use_colors', True),
-                    show_module=console_format.get('show_module', True),
-                    timestamp_format=console_format.get('timestamp_format', '%Y-%m-%d %H:%M:%S'),
-                    fmt='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
-                )
+                console_processors.append(
+                    colored_console_renderer(
+                        console_format.get('use_colors', True),
+                        console_show_module
+                    )
+                )
+            console_handler = logging.StreamHandler(sys.stdout)
+            console_handler.setLevel(getattr(logging, console_level))
+            console_handler.addFilter(UvicornStructlogFilter())
+            console_formatter = structlog.stdlib.ProcessorFormatter(
+                processor=colored_console_renderer(
+                    console_format.get('use_colors', True),
+                    console_show_module
+                )
+                if console_format.get('type') != 'json'
+                else json_renderer(),
+            )
             console_handler.setFormatter(console_formatter)
-            console_handler.addFilter(UvicornLogFilter())
+            root_logger = logging.getLogger()
+            root_logger.setLevel(logging.DEBUG)
             root_logger.addHandler(console_handler)
             self.handlers['console'] = console_handler
@@ -220,7 +242,8 @@ class PyServeLogManager:
file_loggers = file_config.get('loggers', [])
max_bytes = file_config.get('max_bytes', 10 * 1024 * 1024)
backup_count = file_config.get('backup_count', 5)
file_format = file_config.get('format', {})
file_show_module = file_format.get('show_module', True)

self._ensure_log_directory(file_path)
@@ -232,50 +255,58 @@ class PyServeLogManager:
)
file_handler.setLevel(getattr(logging, file_level))

if file_loggers:
    file_handler.addFilter(StructlogFilter(file_loggers))

file_processors = shared_processors.copy()
file_processors.append(filter_module_info(file_show_module))

file_formatter = structlog.stdlib.ProcessorFormatter(
    processor=json_renderer()
    if file_format.get('type') == 'json'
    else plain_console_renderer(file_show_module),
)
file_handler.setFormatter(file_formatter)

root_logger = logging.getLogger()
root_logger.addHandler(file_handler)
self.handlers[f'file_{i}'] = file_handler
base_processors = [
    structlog.stdlib.filter_by_level,
    add_timestamp,
    add_log_level,
    add_module_info,
    structlog.processors.StackInfoRenderer(),
    structlog.processors.format_exc_info,
]

structlog.configure(
    processors=cast(Any, base_processors + [structlog.stdlib.ProcessorFormatter.wrap_for_formatter]),
    context_class=dict,
    logger_factory=structlog.stdlib.LoggerFactory(),
    wrapper_class=structlog.stdlib.BoundLogger,
    cache_logger_on_first_use=True,
)
self._structlog_configured = True

def _configure_stdlib_loggers(self, main_level: str) -> None:
    library_configs = {
        'uvicorn': 'DEBUG' if main_level == 'DEBUG' else 'WARNING',
        'uvicorn.access': 'DEBUG' if main_level == 'DEBUG' else 'WARNING',
        'uvicorn.error': 'DEBUG' if main_level == 'DEBUG' else 'ERROR',
        'uvicorn.asgi': 'DEBUG' if main_level == 'DEBUG' else 'WARNING',
        'starlette': 'DEBUG' if main_level == 'DEBUG' else 'WARNING',
        'asyncio': 'WARNING',
        'concurrent.futures': 'WARNING',
        'multiprocessing': 'WARNING',
    }

    for logger_name, level in library_configs.items():
        logger = logging.getLogger(logger_name)
        logger.setLevel(getattr(logging, level))
        logger.propagate = True
def _save_original_handlers(self) -> None:
    logger_names = ['', 'uvicorn', 'uvicorn.access', 'uvicorn.error', 'starlette']
@@ -288,14 +319,12 @@ class PyServeLogManager:
root_logger = logging.getLogger()
for handler in root_logger.handlers[:]:
    root_logger.removeHandler(handler)

logger_names = ['uvicorn', 'uvicorn.access', 'uvicorn.error', 'starlette']
for name in logger_names:
    logger = logging.getLogger(name)
    for handler in logger.handlers[:]:
        logger.removeHandler(handler)

self.handlers.clear()
@@ -303,57 +332,29 @@ class PyServeLogManager:
log_dir = Path(log_file).parent
log_dir.mkdir(parents=True, exist_ok=True)
def get_logger(self, name: str) -> structlog.stdlib.BoundLogger:
    if not self._structlog_configured:
        structlog.configure(
            processors=cast(Any, [
                structlog.stdlib.filter_by_level,
                add_timestamp,
                add_log_level,
                structlog.processors.StackInfoRenderer(),
                structlog.processors.format_exc_info,
                structlog.stdlib.ProcessorFormatter.wrap_for_formatter,
            ]),
            context_class=dict,
            logger_factory=structlog.stdlib.LoggerFactory(),
            wrapper_class=structlog.stdlib.BoundLogger,
            cache_logger_on_first_use=True,
        )
        self._structlog_configured = True

    return cast(structlog.stdlib.BoundLogger, structlog.get_logger(name).bind(logger_name=name))
def set_level(self, logger_name: str, level: str) -> None:
    logger = logging.getLogger(logger_name)
    logger.setLevel(getattr(logging, level.upper()))
def add_handler(self, name: str, handler: logging.Handler) -> None:
    if name not in self.handlers:
@@ -363,20 +364,32 @@ class PyServeLogManager:
def remove_handler(self, name: str) -> None:
    if name in self.handlers:
        handler = self.handlers[name]
        root_logger = logging.getLogger()
        root_logger.removeHandler(handler)
        handler.close()
        del self.handlers[name]

def create_access_log(
    self,
    method: str,
    path: str,
    status_code: int,
    response_time: float,
    client_ip: str,
    user_agent: str = ""
) -> None:
    access_logger = self.get_logger('pyserve.access')
    access_logger.info(
        "HTTP access",
        method=method,
        path=path,
        status_code=status_code,
        response_time_ms=round(response_time * 1000, 2),
        client_ip=client_ip,
        user_agent=user_agent,
        timestamp_format="access"
    )
def shutdown(self) -> None:
    for handler in self.handlers.values():
@@ -388,8 +401,8 @@ class PyServeLogManager:
for handler in handlers:
    logger.addHandler(handler)

self.configured = False
self._structlog_configured = False


log_manager = PyServeLogManager()
@@ -399,12 +412,18 @@ def setup_logging(config: Dict[str, Any]) -> None:
log_manager.setup_logging(config)


def get_logger(name: str) -> structlog.stdlib.BoundLogger:
    return log_manager.get_logger(name)


def create_access_log(
    method: str,
    path: str,
    status_code: int,
    response_time: float,
    client_ip: str,
    user_agent: str = ""
) -> None:
    log_manager.create_access_log(method, path, status_code, response_time, client_ip, user_agent)


@@ -48,7 +48,15 @@ class PyServeMiddleware:
status_code = response.status_code
process_time = round((time.time() - start_time) * 1000, 2)

self.access_logger.info(
    "HTTP request",
    client_ip=client_ip,
    method=method,
    path=path,
    status_code=status_code,
    process_time_ms=process_time,
    user_agent=request.headers.get("user-agent", "")
)

await response(scope, receive, send)
@@ -64,7 +72,7 @@ class PyServeServer:
def _setup_logging(self) -> None:
    self.config.setup_logging()
    logger.info("PyServe server initialized", version=__version__)
def _load_extensions(self) -> None:
    for ext_config in self.config.extensions:
@@ -106,7 +114,8 @@ class PyServeServer:
ext_metrics = getattr(extension, 'get_metrics')()
metrics.update(ext_metrics)
except Exception as e:
    logger.error("Error getting metrics from extension",
                 extension=type(extension).__name__, error=str(e))

import json
return Response(
@@ -122,11 +131,11 @@ class PyServeServer:
return None

if not Path(self.config.ssl.cert_file).exists():
    logger.error("SSL certificate not found", cert_file=self.config.ssl.cert_file)
    return None

if not Path(self.config.ssl.key_file).exists():
    logger.error("SSL key not found", key_file=self.config.ssl.key_file)
    return None

try:
@@ -138,7 +147,7 @@ class PyServeServer:
    logger.info("SSL context created successfully")
    return context
except Exception as e:
    logger.error("Error creating SSL context", error=str(e), exc_info=True)
    return None
def run(self) -> None:
@@ -167,7 +176,12 @@ class PyServeServer:
else:
    protocol = "http"

logger.info(
    "Starting PyServe server",
    protocol=protocol,
    host=self.config.server.host,
    port=self.config.server.port
)

try:
    assert self.app is not None, "App not initialized"
@@ -175,7 +189,7 @@ class PyServeServer:
except KeyboardInterrupt:
    logger.info("Received shutdown signal")
except Exception as e:
    logger.error("Error starting server", error=str(e), exc_info=True)
finally:
    self.shutdown()
@@ -193,6 +207,7 @@ class PyServeServer:
log_level="critical",
access_log=False,
use_colors=False,
backlog=self.config.server.backlog if self.config.server.backlog else 2048,
)

server = uvicorn.Server(config)
@@ -215,7 +230,7 @@ class PyServeServer:
for directory in directories:
    Path(directory).mkdir(parents=True, exist_ok=True)
    logger.debug("Created/checked directory", directory=directory)

def shutdown(self) -> None:
    logger.info("Shutting down PyServe server")
@@ -238,7 +253,8 @@ class PyServeServer:
ext_metrics = getattr(extension, 'get_metrics')()
metrics.update(ext_metrics)
except Exception as e:
    logger.error("Error getting metrics from extension",
                 extension=type(extension).__name__, error=str(e))

return metrics

scripts/release.sh (new executable file, 106 lines)

@@ -0,0 +1,106 @@
#!/bin/bash
# Release management script for PyServeX
# Usage: ./scripts/release.sh [patch|minor|major]
set -e
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_DIR="$(cd "$SCRIPT_DIR/.." && pwd)"
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m'
print_color() {
local color=$1
local message=$2
echo -e "${color}${message}${NC}"
}
if ! git rev-parse --git-dir > /dev/null 2>&1; then
print_color $RED "Error: Not in a git repository"
exit 1
fi
if [[ -n $(git status --porcelain) ]]; then
print_color $RED "Error: Working directory has uncommitted changes"
print_color $YELLOW "Commit all changes before creating a release"
git status --short
exit 1
fi
VERSION_TYPE=${1:-patch}
if [[ ! "$VERSION_TYPE" =~ ^(patch|minor|major)$ ]]; then
print_color $RED "Error: Invalid version type. Use: patch, minor or major"
exit 1
fi
print_color $BLUE "Starting release process..."
cd "$PROJECT_DIR"
CURRENT_VERSION=$(poetry version -s)
print_color $YELLOW "Current version: $CURRENT_VERSION"
print_color $BLUE "Updating version ($VERSION_TYPE)..."
poetry version $VERSION_TYPE
NEW_VERSION=$(poetry version -s)
print_color $GREEN "New version: $NEW_VERSION"
print_color $BLUE "Running tests..."
if ! poetry run pytest tests/ -v; then
print_color $RED "Tests failed. Rolling back changes..."
git checkout pyproject.toml
exit 1
fi
print_color $BLUE "Running linter checks..."
if ! make lint; then
print_color $RED "Linter found issues. Rolling back changes..."
git checkout pyproject.toml
exit 1
fi
print_color $BLUE "Building package..."
if ! poetry build; then
print_color $RED "Build failed. Rolling back changes..."
git checkout pyproject.toml
exit 1
fi
print_color $BLUE "Committing version change..."
git add pyproject.toml
git commit -m "bump version to $NEW_VERSION"
print_color $BLUE "Creating tag v$NEW_VERSION..."
git tag "v$NEW_VERSION"
print_color $YELLOW "Ready to push to server:"
print_color $YELLOW " - Commit with new version: $NEW_VERSION"
print_color $YELLOW " - Tag: v$NEW_VERSION"
echo
read -p "Push changes to server? (y/N): " -n 1 -r
echo
if [[ $REPLY =~ ^[Yy]$ ]]; then
print_color $BLUE "Pushing commit and tag to server..."
git push origin main
git push origin "v$NEW_VERSION"
print_color $GREEN "Release created successfully!"
print_color $GREEN "Version: $NEW_VERSION"
print_color $GREEN "Tag: v$NEW_VERSION pushed"
print_color $YELLOW "Pipeline will automatically create draft release in Gitea"
print_color $YELLOW "Don't forget to edit release description in Gitea interface"
else
print_color $YELLOW "Changes not pushed to server"
print_color $YELLOW "To push later, run:"
print_color $BLUE " git push origin main"
print_color $BLUE " git push origin v$NEW_VERSION"
fi
print_color $GREEN "Script completed!"
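The script delegates version arithmetic to `poetry version patch|minor|major`, which follows Semantic Versioning: `patch` bumps the last component, `minor` bumps the middle and resets `patch`, `major` bumps the first and resets both. A rough standalone equivalent of that bump logic (a sketch for plain `MAJOR.MINOR.PATCH` strings; it ignores pre-release and build suffixes that Poetry also handles):

```python
def bump(version: str, part: str) -> str:
    """Return the next SemVer string, roughly mirroring `poetry version <part>`."""
    major, minor, patch = (int(x) for x in version.split("."))
    if part == "major":
        return f"{major + 1}.0.0"
    if part == "minor":
        return f"{major}.{minor + 1}.0"
    if part == "patch":
        return f"{major}.{minor}.{patch + 1}"
    raise ValueError(f"unknown part: {part}")

print(bump("1.4.2", "patch"))  # → 1.4.3
print(bump("1.4.2", "minor"))  # → 1.5.0
print(bump("1.4.2", "major"))  # → 2.0.0
```

This is also why the release script validates `$VERSION_TYPE` against exactly those three words before touching `pyproject.toml`.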