Installation Guide
This guide provides comprehensive instructions for installing Artifex on various platforms and configurations.
System Requirements
Minimum Requirements
- Python: 3.10 or higher
- RAM: 8GB (16GB recommended)
- Disk Space: 2GB for installation, 10GB+ for datasets
- Operating System: Linux (Ubuntu 20.04+), macOS (10.15+), Windows 10/11 (via WSL2)
Recommended Requirements
- Python: 3.10 or 3.11 (tested and recommended)
- RAM: 16GB+ for training models
- GPU: NVIDIA GPU with 8GB+ VRAM (for GPU acceleration)
  - Compute Capability 7.0+ (V100, RTX 20xx, RTX 30xx, RTX 40xx, A100)
  - CUDA 12.0 or higher
- Disk Space: 50GB+ for large-scale experiments
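To check whether your GPU meets these recommendations, you can query its model and, on reasonably recent NVIDIA drivers, its compute capability:
# Show GPU model, driver version, and supported CUDA version
nvidia-smi
# Query compute capability directly (the compute_cap field requires a recent driver)
nvidia-smi --query-gpu=name,compute_cap --format=csv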
Installation Options
Artifex provides multiple installation methods to suit different use cases:
Installing from PyPI is the simplest way to get started:
# CPU-only installation
pip install artifex
# With GPU support
pip install artifex[cuda]
# With all extras (documentation, development tools)
pip install artifex[all]
Note: PyPI package coming soon. For now, install from source.
uv is a fast Python package manager. Artifex provides an automated setup script that handles everything:
Quick Setup (Recommended):
# Clone the repository
git clone https://github.com/avitai/artifex.git
cd artifex
# Run unified setup script (auto-detects GPU)
./setup.sh
# Activate environment
source ./activate.sh
What setup.sh does:
- Installs the uv package manager if not present
- Detects GPU/CUDA availability automatically
- Creates a .venv virtual environment
- Installs all dependencies (uv sync --extra all for GPU or --extra dev for CPU)
- Creates a .env file with:
  - CUDA library paths (GPU mode)
  - JAX platform configuration
  - Environment variables for optimal performance
- Generates the activate.sh script for easy activation
- Verifies installation with JAX GPU tests
What activate.sh does:
- Activates the .venv virtual environment
- Loads environment variables from .env
- Configures CUDA paths (GPU mode)
- Verifies JAX installation and GPU detection
- Displays environment status and helpful commands
Setup Options:
# CPU-only setup (skip GPU detection)
./setup.sh --cpu-only
# Clean setup (removes caches)
./setup.sh --deep-clean
# Force reinstall
./setup.sh --force
# Verbose output
./setup.sh --verbose
Manual Setup (Advanced):
# Install uv if needed
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create virtual environment
uv venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install Artifex with all dependencies
uv sync --all-extras
# Or install specific extras
uv sync --extra cuda-dev # CUDA development environment
uv sync --extra dev # Development tools only
Using standard pip with Artifex's setup script:
Quick Setup with Scripts:
# Clone the repository
git clone https://github.com/avitai/artifex.git
cd artifex
# Run setup script (installs uv automatically)
./setup.sh
# Activate environment
source ./activate.sh
The setup script works with both uv and pip. It will:
- Auto-detect and install uv if needed
- Create virtual environment and configure CUDA
- Generate activation script with proper environment setup
Manual pip Installation:
# Clone the repository
git clone https://github.com/avitai/artifex.git
cd artifex
# Create and activate virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install in development mode
pip install -e .
# Or with extras
pip install -e '.[dev]' # Development tools
pip install -e '.[cuda]' # CUDA support
pip install -e '.[all]' # Everything
# For GPU support, manually configure environment:
export LD_LIBRARY_PATH=$PWD/.venv/lib/python3.*/site-packages/nvidia/*/lib:$LD_LIBRARY_PATH
export JAX_PLATFORMS="cuda,cpu"
export XLA_PYTHON_CLIENT_PREALLOCATE="false"
Note: Using ./setup.sh is recommended even for pip users as it properly configures CUDA paths and environment variables automatically.
For containerized deployment:
# Pull the latest image
docker pull ghcr.io/avitai/artifex:latest
# Run with GPU support
docker run --gpus all -it ghcr.io/avitai/artifex:latest
# Run with volume mount for data
docker run --gpus all -v $(pwd)/data:/workspace/data \
-it ghcr.io/avitai/artifex:latest
# Or build locally
docker build -t artifex:local .
docker run --gpus all -it artifex:local
Local Virtual Environment Setup
Artifex's setup.sh and activate.sh scripts provide automated local development environment configuration with GPU support.
Need Hardware-Specific Configuration?
For detailed information on customizing the setup for different GPUs (NVIDIA, AMD), TPUs, Apple Silicon, or multi-GPU systems, see the Hardware Setup Guide. This comprehensive guide explains how setup.sh and .env.example work and how to customize them for your specific hardware.
Understanding the Setup Process
When you run ./setup.sh, it creates a complete, self-contained development environment:
Files Created:
artifex/
├── .venv/                 # Virtual environment
│   ├── bin/activate       # Standard venv activation
│   ├── lib/python3.*/     # Python packages
│   └── ...
├── .env                   # Environment configuration
├── activate.sh            # Unified activation script
└── uv.lock                # Dependency lock file
The .env File (GPU Mode):
# CUDA library paths (from local venv installation)
export LD_LIBRARY_PATH=".venv/lib/python3.*/site-packages/nvidia/*/lib:$LD_LIBRARY_PATH"
# JAX GPU configuration
export JAX_PLATFORMS="cuda,cpu"
export XLA_PYTHON_CLIENT_PREALLOCATE="false"
export XLA_PYTHON_CLIENT_MEM_FRACTION="0.8"
# Project paths
export PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}$(pwd)"
export PYTEST_CUDA_ENABLED="true"
The .env File (CPU Mode):
# JAX CPU configuration
export JAX_PLATFORMS="cpu"
export JAX_ENABLE_X64="0"
# Project paths
export PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}$(pwd)"
export PYTEST_CUDA_ENABLED="false"
Using the Environment
First Time Setup:
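These are the same quick-setup commands shown above:
# Clone, set up, and activate
git clone https://github.com/avitai/artifex.git
cd artifex
./setup.sh
source ./activate.sh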
Daily Workflow:
# Simply activate the environment
source ./activate.sh
# Your environment is now ready with:
# - Virtual environment activated
# - CUDA paths configured (if GPU)
# - JAX optimally configured
# - Project in PYTHONPATH
# Work on your code...
uv run pytest tests/
python your_script.py
# Deactivate when done
deactivate
Activation Script Features
The activate.sh script provides:
- Smart Process Detection: Checks for running processes before deactivation
- GPU Verification: Tests GPU availability and displays device info
- Environment Status: Shows Python version, JAX backend, available devices
- Helpful Commands: Displays common development commands
- Error Handling: Provides clear messages if setup is incomplete
Example activation output:
Activating Artifex Development Environment
=============================================
Virtual environment activated
Environment configuration loaded
GPU Mode: CUDA enabled

Environment Status:
  Python: Python 3.11.5
  Virtual Environment: /path/to/artifex/.venv

JAX Configuration:
  JAX version: 0.4.35
  Default backend: gpu
  Available devices: 1 total
  GPU devices: 1 (['cuda(id=0)'])

CUDA acceleration ready!
Ready for Development!
Setup Script Options
Customize setup for different scenarios:
# Standard setup (auto-detects GPU)
./setup.sh
# CPU-only setup (laptop/CI)
./setup.sh --cpu-only
# Clean installation (remove all caches)
./setup.sh --deep-clean
# Force reinstall over existing environment
./setup.sh --force
# Verbose output for debugging
./setup.sh --verbose
# Combine options
./setup.sh --force --deep-clean --verbose
Troubleshooting Local Setup
Problem: ./setup.sh: Permission denied
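The usual fix is to make the script executable (assuming it simply lost its executable bit when cloned or copied):
# Restore the executable bit, then re-run
chmod +x setup.sh
./setup.sh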
Problem: GPU not detected after setup
# Check NVIDIA drivers
nvidia-smi
# Re-run setup with force
./setup.sh --force
# Check activation output for GPU status
source ./activate.sh
Problem: Environment variables not loaded
# Verify .env file exists
cat .env
# Use 'source' not 'bash'
source ./activate.sh    # Correct
bash activate.sh        # Won't load environment
GPU Setup
CUDA Installation
Artifex requires CUDA 12.0+ for GPU acceleration.
Linux (Ubuntu/Debian)
# Check if CUDA is already installed
nvcc --version
nvidia-smi
# If not installed, download CUDA Toolkit 12.0+
# Visit: https://developer.nvidia.com/cuda-downloads
# Example for Ubuntu 22.04
wget https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-keyring_1.1-1_all.deb
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt-get update
sudo apt-get -y install cuda-toolkit-12-0
# Add CUDA to PATH
echo 'export PATH=/usr/local/cuda/bin:$PATH' >> ~/.bashrc
echo 'export LD_LIBRARY_PATH=/usr/local/cuda/lib64:$LD_LIBRARY_PATH' >> ~/.bashrc
source ~/.bashrc
macOS
CUDA is not supported on macOS. Use Metal backend (experimental) or CPU mode.
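If you want to experiment with the Metal backend, one possible route is Apple's experimental jax-metal plugin; it is not installed or configured by setup.sh, and compatibility depends on your macOS and JAX versions:
# Experimental and outside Artifex's supported setup path
pip install jax-metal
python -c "import jax; print(jax.devices())"  # may list a METAL device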
Windows (WSL2)
Windows users should use WSL2 with Ubuntu and then follow the Linux instructions inside WSL2 (see also the Windows notes under Platform-Specific Notes below):
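A typical starting point, assuming a recent Windows 10/11 build with WSL available (run in an elevated PowerShell, then reboot):
# In PowerShell (administrator): install WSL2 with Ubuntu 22.04
wsl --install -d Ubuntu-22.04
# After reboot, open the Ubuntu shell and follow the Linux installation steps above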
Automated GPU Setup (Linux)
Artifex provides automated CUDA setup through the main setup script:
# Complete environment setup with CUDA (recommended)
./setup.sh
# This automatically:
# - Detects GPU and CUDA availability
# - Installs JAX with CUDA support
# - Configures CUDA library paths
# - Sets up JAX environment variables
# - Creates activation script with GPU verification
# - Tests GPU functionality
# For CPU-only systems:
./setup.sh --cpu-only
What gets configured automatically:
- LD_LIBRARY_PATH: points to CUDA libraries in .venv/lib/python3.*/site-packages/nvidia/*/lib
- JAX_PLATFORMS: set to "cuda,cpu" for GPU or "cpu" for CPU-only
- XLA_PYTHON_CLIENT_PREALLOCATE: disabled for dynamic memory allocation
- XLA_PYTHON_CLIENT_MEM_FRACTION: set to 0.8 (use 80% of GPU memory)
- JAX CUDA plugins: automatically matched to the JAX version
See Local Virtual Environment Setup for detailed explanation.
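To confirm these settings were actually picked up after source ./activate.sh (expected values assume the GPU-mode defaults above):
echo "$JAX_PLATFORMS"                    # expect: cuda,cpu (or cpu on CPU-only setups)
echo "$XLA_PYTHON_CLIENT_MEM_FRACTION"   # expect: 0.8
python -c "import jax; print(jax.default_backend())"  # expect: gpu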
JAX GPU Installation
After CUDA is installed:
# Install JAX with CUDA support
pip install "jax[cuda12_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
# Verify GPU is detected
python -c "import jax; print(jax.devices())"
# Should print: [cuda(id=0)] or similar
Troubleshooting GPU Installation
Problem: jax.devices() returns [cpu(id=0)] instead of GPU
Solutions:
- Set LD_LIBRARY_PATH to the CUDA libraries bundled in the virtual environment (see the export line in the manual pip installation above)
- Use the setup script: ./setup.sh --force
- Reinstall JAX with CUDA:
pip uninstall jax jaxlib
pip install "jax[cuda12_pip]" -f https://storage.googleapis.com/jax-releases/jax_cuda_releases.html
- Check CUDA installation:
nvcc --version # Should show CUDA version
nvidia-smi # Should show GPU info
python -c "import jax; print(jax.lib.xla_bridge.get_backend().platform)" # Should print 'gpu'
Problem: CUDA out of memory
Solutions:
- Reduce batch size
- Enable mixed precision training (BF16/FP16)
- Use gradient accumulation
- Clear GPU cache:
jax.clear_caches()
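You can also cap how much GPU memory JAX uses via the same variables the setup script writes to .env; the fraction below is only an example:
# Reduce JAX's GPU memory footprint before launching training
export XLA_PYTHON_CLIENT_PREALLOCATE=false
export XLA_PYTHON_CLIENT_MEM_FRACTION=0.5  # use at most ~50% of GPU memory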
Problem: Slow training on GPU
Solutions:
- Ensure XLA is enabled (automatic with JAX)
- Use JIT compilation: @jax.jit (see the sketch below)
- Check GPU utilization: nvidia-smi dmon
- Increase batch size for better GPU utilization
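As a minimal illustration of JIT compilation (the function and shapes here are made up, not part of Artifex):
import jax
import jax.numpy as jnp

@jax.jit  # compiled on the first call, cached and reused afterwards
def step(params, x):
    return jnp.mean((x @ params) ** 2)

params = jnp.ones((8, 8))
x = jnp.ones((32, 8))
print(step(params, x))  # later calls with the same shapes skip recompilation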
TPU Setup
For Google Cloud TPU:
# Install JAX with TPU support
pip install "jax[tpu]" -f https://storage.googleapis.com/jax-releases/libtpu_releases.html
# Verify TPU is detected
python -c "import jax; print(jax.devices())"
# Should print TPU devices
Verification
After installation, verify your setup:
# test_installation.py
import jax
import jax.numpy as jnp
from flax import nnx
from artifex.generative_models.models.vae import VAE
from artifex.generative_models.core.configuration import (
VAEConfig,
EncoderConfig,
DecoderConfig,
)
print("JAX version:", jax.__version__)
print("JAX backend:", jax.default_backend())
print("Available devices:", jax.devices())
# Test simple computation
x = jnp.array([1.0, 2.0, 3.0])
print("JAX computation test:", jnp.sum(x))
# Test Artifex imports
encoder = EncoderConfig(
name="test_encoder",
input_shape=(28, 28, 1),
latent_dim=32,
hidden_dims=(64, 128),
activation="relu",
)
decoder = DecoderConfig(
name="test_decoder",
latent_dim=32,
output_shape=(28, 28, 1),
hidden_dims=(128, 64),
activation="relu",
)
config = VAEConfig(
name="test_vae",
encoder=encoder,
decoder=decoder,
encoder_type="dense",
kl_weight=1.0,
)
rngs = nnx.Rngs(0)
model = VAE(config, rngs=rngs)
print("Artifex model created successfully!")
# Test forward pass (model uses internal RNGs)
batch = jax.random.normal(jax.random.PRNGKey(0), (4, 28, 28, 1))
outputs = model(batch)
print("Forward pass successful!")
print("Output keys:", list(outputs.keys()))
Run the verification:
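Save the snippet above as test_installation.py and run it inside the activated environment:
python test_installation.py
# or, through uv:
uv run python test_installation.py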
Expected output:
JAX version: 0.8.2
JAX backend: gpu (or cpu)
Available devices: [CudaDevice(id=0)] (or [CpuDevice(id=0)])
JAX computation test: 6.0
Artifex model created successfully!
Forward pass successful!
Output keys: ['reconstructed', 'mean', 'log_var', 'z']
Development Installation
For contributing to Artifex:
Quick Development Setup:
# Clone repository
git clone https://github.com/avitai/artifex.git
cd artifex
# Run setup script (includes dev tools)
./setup.sh
# Activate environment
source ./activate.sh
# Install pre-commit hooks
uv run pre-commit install
# Verify development setup
uv run pytest tests/ -x # Run tests
uv run ruff check src/ # Linting
uv run ruff format src/ # Formatting
uv run pyright src/ # Type checking
What's included in development setup:
- All core dependencies (uv sync --extra all or --extra dev)
- Development tools: pytest, ruff, pyright, pre-commit
- GPU support (if available)
- Documentation tools (mkdocs, mkdocstrings)
- Benchmarking utilities
- Test coverage tools
Daily development workflow:
# Start your work session
source ./activate.sh
# Make changes to code
# ...
# Run tests before committing
uv run pytest tests/ -x
# Run pre-commit checks
uv run pre-commit run --all-files
# Commit your changes
git add .
git commit -m "Your commit message"
# Deactivate when done
deactivate
Platform-Specific Notes
Linux
- Recommended: Ubuntu 20.04 LTS or Ubuntu 22.04 LTS
- Ensure NVIDIA drivers are up-to-date: sudo ubuntu-drivers autoinstall
- For multi-GPU: set CUDA_VISIBLE_DEVICES="0,1" (see the example below)
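For example, to restrict JAX to the first two GPUs and confirm what it sees (jax.devices() is the same check used in the verification section):
export CUDA_VISIBLE_DEVICES="0,1"
python -c "import jax; print(jax.devices())"  # expect two CudaDevice entries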
macOS
- Apple Silicon (M1/M2/M3): JAX has experimental support
- Intel Macs: CPU-only mode is fully supported
- Xcode Command Line Tools required: xcode-select --install
Windows
- WSL2 Required: Native Windows support is limited
- Install Ubuntu 22.04 from Microsoft Store
- Follow Linux installation inside WSL2
- GPU support requires Windows 11 + CUDA in WSL2
Environment Variables
Configure Artifex behavior with environment variables:
# JAX Configuration
export JAX_PLATFORMS="cuda,cpu" # or 'cpu', 'tpu'
export XLA_PYTHON_CLIENT_PREALLOCATE=false # Disable memory preallocation
export XLA_PYTHON_CLIENT_MEM_FRACTION=0.75 # Use 75% of GPU memory
# Artifex Configuration
export ARTIFEX_CACHE_DIR=~/.cache/artifex # Cache directory
export ARTIFEX_DATA_DIR=~/artifex_data # Data directory
export ARTIFEX_LOG_LEVEL=INFO # Logging level
# Development
export ENABLE_BLACKJAX_TESTS=1 # Enable expensive BlackJAX tests
Add to ~/.bashrc or ~/.zshrc for persistence.
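For reference, a sketch of how a script might read these variables; the fallback defaults shown are assumptions for illustration, not Artifex's actual internals:
import os

# Hypothetical example of reading the Artifex-related variables above
cache_dir = os.environ.get("ARTIFEX_CACHE_DIR", os.path.expanduser("~/.cache/artifex"))
data_dir = os.environ.get("ARTIFEX_DATA_DIR", os.path.expanduser("~/artifex_data"))
log_level = os.environ.get("ARTIFEX_LOG_LEVEL", "INFO")
print(cache_dir, data_dir, log_level)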
Common Issues
Import Errors
Problem: ModuleNotFoundError: No module named 'artifex'
Solutions:
# Verify installation
pip list | grep artifex
# Reinstall
pip install -e .
# Check Python path
python -c "import sys; print(sys.path)"
Version Conflicts
Problem: Dependency version conflicts
Solutions:
# Clean install
deactivate
rm -rf .venv uv.lock
uv venv
source .venv/bin/activate
uv sync --all-extras
Memory Issues
Problem: Out of memory during installation
Solutions:
# Install without the pip cache to reduce memory and disk pressure
pip install --no-cache-dir -e .
# Or use uv which is more memory efficient
uv sync --all-extras
Updating Artifex
Keep Artifex up-to-date:
# From PyPI (when available)
pip install --upgrade artifex
# From source
cd artifex
git pull origin main
uv sync --all-extras # or: pip install -e .
Uninstallation
Remove Artifex completely:
# If installed from PyPI
pip uninstall artifex
# If installed from source
pip uninstall artifex
# Remove cache and data (optional)
rm -rf ~/.cache/artifex
rm -rf ~/artifex_data
# Remove virtual environment
deactivate
rm -rf .venv
Next Steps
After successful installation:
- Quick Start: Follow the Quickstart Guide to run your first model
- Core Concepts: Learn about Artifex architecture
- First Model: Build a VAE from scratch with the First Model Tutorial
- Examples: Explore ready-to-run Examples
Getting Help
If you encounter issues:
- Documentation: Check this guide and the documentation index
- GitHub Issues: Report bugs or request features
- Discussions: Ask questions
- Discord: Join our community (coming soon)
Hardware Recommendations
For Research/Development
- CPU: 8+ cores (Intel i7/i9 or AMD Ryzen 7/9)
- RAM: 16-32GB
- GPU: NVIDIA RTX 3060 (12GB) or better
- Storage: SSD with 100GB+ free space
For Production
- CPU: 16+ cores (Intel Xeon or AMD EPYC)
- RAM: 64GB+
- GPU: NVIDIA A100 (40/80GB) or H100
- Storage: NVMe SSD with 500GB+ free space
For Large-Scale Training
- Multi-GPU: 4-8x NVIDIA A100 or H100
- RAM: 256GB+
- Network: High-speed interconnect (NVLink, InfiniBand)
- Storage: Parallel file system (Lustre, GPFS)
Last Updated: 2025-10-13
Installation Support: For installation help, open an issue.