
Logs and Debugging

Easy Deploy provides comprehensive logging and debugging tools to help you troubleshoot issues, monitor application behavior, and optimize performance.

Your application’s console output and custom log messages:

  • Console Output: stdout and stderr from your application
  • Structured Logs: JSON formatted logs with metadata
  • Custom Logs: Application-specific logging
  • Framework Logs: Express, Django, Rails, etc. framework logs

Infrastructure and platform logs:

  • Deployment Logs: Build and deployment process logs
  • HTTP Access Logs: Request and response logs
  • Error Logs: System errors and exceptions
  • Performance Logs: Resource usage and timing data

Logs from connected services:

  • Database Logs: Query logs and connection events
  • CDN Logs: Content delivery and caching logs
  • Load Balancer Logs: Traffic distribution logs
  • Security Logs: Authentication and authorization events

View logs in your application dashboard:

  1. Navigate to your application
  2. Go to Logs section
  3. Select log type and time range
  4. Use filters to find specific entries

Access logs via Easy Deploy CLI:

Terminal window
# Install CLI
npm install -g @easydeploy/cli

# Login
easydeploy login

# View real-time logs
easydeploy logs APP_NAME --follow

# View logs for specific time range
easydeploy logs APP_NAME --since=1h

# Filter logs by level
easydeploy logs APP_NAME --level=error

# View specific number of lines
easydeploy logs APP_NAME --tail=100

# Export logs to file
easydeploy logs APP_NAME --since=24h > logs.txt

Access logs programmatically:

Terminal window
# Get application logs
curl -H "Authorization: Bearer YOUR_API_KEY" \
  "https://api.easydeploy.com/v1/applications/APP_ID/logs?since=1h&level=error"

# Stream real-time logs over WebSocket (curl cannot complete the
# WebSocket handshake; use a WebSocket client such as wscat)
wscat -c "wss://api.easydeploy.com/v1/applications/APP_ID/logs/stream" \
  -H "Authorization: Bearer YOUR_API_KEY"

# Download log archive
curl -H "Authorization: Bearer YOUR_API_KEY" \
  "https://api.easydeploy.com/v1/applications/APP_ID/logs/download?start=2024-01-01&end=2024-01-31" \
  -o logs.zip

Use structured logging for better searchability:

Node.js
// Node.js with Winston
const winston = require('winston');

const logger = winston.createLogger({
  level: process.env.LOG_LEVEL || 'info',
  format: winston.format.combine(
    winston.format.timestamp(),
    winston.format.errors({ stack: true }),
    winston.format.json()
  ),
  transports: [
    new winston.transports.Console()
  ]
});

// Structured log example
logger.info('User authentication', {
  event: 'user.login',
  userId: user.id,
  email: user.email,
  ip: req.ip,
  userAgent: req.get('User-Agent'),
  duration: loginTime,
  success: true
});

logger.error('Database connection failed', {
  event: 'database.error',
  database: 'postgresql',
  host: process.env.DB_HOST,
  error: error.message,
  stack: error.stack,
  retryAttempt: 3
});
Python
import logging
import json
from datetime import datetime

# Configure structured logging
class StructuredFormatter(logging.Formatter):
    def format(self, record):
        log_entry = {
            'timestamp': datetime.utcnow().isoformat(),
            'level': record.levelname,
            'message': record.getMessage(),
            'module': record.module,
            'function': record.funcName,
            'line': record.lineno
        }
        if hasattr(record, 'extra_fields'):
            log_entry.update(record.extra_fields)
        return json.dumps(log_entry)

# Setup logger
logger = logging.getLogger(__name__)
handler = logging.StreamHandler()
handler.setFormatter(StructuredFormatter())
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# Usage
logger.info('User created', extra={
    'extra_fields': {
        'event': 'user.created',
        'user_id': user.id,
        'email': user.email,
        'ip': request.remote_addr
    }
})
PHP
<?php
use Monolog\Logger;
use Monolog\Handler\StreamHandler;
use Monolog\Formatter\JsonFormatter;

// Create logger
$logger = new Logger('app');
$handler = new StreamHandler('php://stdout', Logger::INFO);
$handler->setFormatter(new JsonFormatter());
$logger->pushHandler($handler);

// Usage
$logger->info('User authentication', [
    'event' => 'user.login',
    'user_id' => $user->id,
    'email' => $user->email,
    'ip' => $_SERVER['REMOTE_ADDR'],
    'success' => true
]);

Use appropriate log levels for different types of events:

  • DEBUG: Detailed diagnostic information
  • INFO: General information about application flow
  • WARNING: Something unexpected happened, but application continues
  • ERROR: Error occurred, but application continues
  • CRITICAL: Serious error, application may stop

Node.js
// DEBUG: Detailed diagnostic information
logger.debug('Processing user request', {
  requestId: req.id,
  method: req.method,
  url: req.url,
  params: req.params,
  query: req.query
});

// INFO: Normal application flow
logger.info('User order completed', {
  event: 'order.completed',
  orderId: order.id,
  userId: user.id,
  amount: order.total,
  items: order.items.length
});

// WARNING: Unexpected but recoverable
logger.warn('API rate limit approaching', {
  event: 'rate_limit.warning',
  current_requests: currentRequests,
  limit: rateLimit,
  user_id: user.id,
  percentage: (currentRequests / rateLimit) * 100
});

// ERROR: Error occurred but app continues
logger.error('Payment processing failed', {
  event: 'payment.error',
  orderId: order.id,
  userId: user.id,
  amount: order.total,
  error: error.message,
  payment_provider: 'stripe',
  retry_count: retryCount
});

// CRITICAL: Serious error requiring immediate attention
// (Winston's default npm levels have no `critical`; either configure
// winston.config.syslog.levels to get logger.crit, or log at error
// with an explicit severity field as shown here)
logger.error('Database connection lost', {
  event: 'database.critical',
  severity: 'critical',
  database: 'postgresql',
  host: process.env.DB_HOST,
  error: error.message,
  uptime: process.uptime(),
  active_connections: connectionPool.totalCount
});

Easy Deploy provides powerful log search:

  • Full-text search: Search across all log messages
  • Field filtering: Filter by specific fields (level, timestamp, user ID)
  • Regular expressions: Use regex patterns for complex searches
  • Time range filtering: Search within specific time periods

Example search queries:

# Search for specific error
level:error AND message:"payment failed"

# Search for specific user activity
userId:12345 AND event:user.*

# Search for slow requests
duration:>1000 AND event:request.completed

# Search for authentication events in last hour
event:user.login AND timestamp:>now-1h

# Search with wildcards
message:database* AND level:error

# Complex search with multiple conditions
(level:error OR level:critical) AND (message:timeout OR message:connection)

Analyze patterns in your logs:

  • Error trends: Track error rates over time
  • Performance metrics: Analyze response times and throughput
  • User behavior: Track user actions and patterns
  • System health: Monitor resource usage and system events

Implement comprehensive error tracking:

Node.js
// Global error handler
process.on('uncaughtException', (error) => {
  // Winston's default npm levels have no `critical`; log at error
  // with an explicit severity field (or configure syslog levels)
  logger.error('Uncaught exception', {
    event: 'error.uncaught_exception',
    severity: 'critical',
    error: error.message,
    stack: error.stack,
    process_id: process.pid,
    memory_usage: process.memoryUsage(),
    uptime: process.uptime()
  });
  // Give time for log to be written
  setTimeout(() => process.exit(1), 1000);
});

process.on('unhandledRejection', (reason) => {
  // the rejected promise itself does not serialize usefully, so log the reason
  logger.error('Unhandled promise rejection', {
    event: 'error.unhandled_rejection',
    reason: String(reason),
    stack: reason?.stack
  });
});

// Express error handler
app.use((error, req, res, next) => {
  const errorId = generateErrorId();
  logger.error('HTTP request error', {
    event: 'http.error',
    error_id: errorId,
    message: error.message,
    stack: error.stack,
    request: {
      method: req.method,
      url: req.url,
      headers: req.headers,
      body: req.body,
      user: req.user
    },
    response_status: error.status || 500
  });
  res.status(error.status || 500).json({
    error: 'Internal Server Error',
    error_id: errorId,
    message: process.env.NODE_ENV === 'development' ? error.message : undefined
  });
});

Track requests across your application:

Node.js
// Request tracing middleware
const { v4: uuidv4 } = require('uuid');

app.use((req, res, next) => {
  req.traceId = uuidv4();
  req.startTime = Date.now();
  logger.info('HTTP request started', {
    event: 'http.request.start',
    trace_id: req.traceId,
    method: req.method,
    url: req.url,
    user_agent: req.get('User-Agent'),
    ip: req.ip,
    user_id: req.user?.id
  });

  // Include req.traceId in structured log entries rather than patching
  // console.log: a global console override is unsafe once requests overlap

  res.on('finish', () => {
    const duration = Date.now() - req.startTime;
    logger.info('HTTP request completed', {
      event: 'http.request.complete',
      trace_id: req.traceId,
      method: req.method,
      url: req.url,
      status: res.statusCode,
      duration: duration,
      response_size: res.get('Content-Length') || 0
    });
  });
  next();
});

Log database operations for debugging:

Node.js
// PostgreSQL query logging with the `pg` driver
const { Pool } = require('pg');

class LoggingPool extends Pool {
  query(text, params, callback) {
    const start = Date.now();
    // getCurrentTraceId() is your request-context helper (see tracing above)
    const traceId = getCurrentTraceId();
    logger.debug('Database query started', {
      event: 'database.query.start',
      trace_id: traceId,
      query: text,
      params: params
    });
    return super.query(text, params, (err, result) => {
      const duration = Date.now() - start;
      if (err) {
        logger.error('Database query failed', {
          event: 'database.query.error',
          trace_id: traceId,
          query: text,
          params: params,
          error: err.message,
          duration: duration
        });
      } else {
        logger.debug('Database query completed', {
          event: 'database.query.complete',
          trace_id: traceId,
          query: text,
          duration: duration,
          rows_affected: result.rowCount
        });
      }
      if (callback) callback(err, result);
    });
  }
}

Configure log retention policies:

.easydeploy.yml
logging:
  retention:
    application_logs: 30d
    access_logs: 90d
    error_logs: 180d
    debug_logs: 7d
  compression:
    enabled: true
    algorithm: gzip
  archival:
    enabled: true
    storage: s3
    bucket: my-app-logs

Automatic log rotation to manage disk space:

  • Size-based rotation: Rotate logs when they reach a certain size
  • Time-based rotation: Rotate logs daily, weekly, or monthly
  • Compression: Compress old log files to save space
  • Cleanup: Automatically delete old log files

Export logs for external analysis:

Terminal window
# Export logs to external system
curl -X POST \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "destination": "s3",
    "bucket": "my-log-bucket",
    "format": "json",
    "compression": "gzip",
    "start_date": "2024-01-01",
    "end_date": "2024-01-31"
  }' \
  "https://api.easydeploy.com/v1/applications/APP_ID/logs/export"

# Set up log forwarding
curl -X POST \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "webhook",
    "url": "https://my-log-service.com/webhook",
    "format": "json",
    "batch_size": 100,
    "filters": {
      "level": ["error", "critical"]
    }
  }' \
  "https://api.easydeploy.com/v1/applications/APP_ID/log-forwarding"

Set up alerts based on log patterns:

# Log alert configuration
log_alerts:
  - name: "High Error Rate"
    condition: "level:error count > 10 in 5 minutes"
    notification: ["[email protected]"]
  - name: "Authentication Failures"
    condition: "event:user.login.failed count > 5 in 1 minute"
    notification: ["[email protected]"]
  - name: "Database Errors"
    condition: "event:database.error count > 3 in 1 minute"
    notification: ["[email protected]"]
  - name: "Slow Requests"
    condition: "duration > 5000 count > 20 in 10 minutes"
    notification: ["[email protected]"]

Create metrics from log data:

Node.js
// Custom metrics from logs
const metrics = require('@easydeploy/metrics');

// Count specific events
logger.info('User signup', {
  event: 'user.signup',
  user_id: user.id,
  plan: user.plan,
  referrer: req.get('Referrer')
});
// This automatically creates metrics:
// - user.signup.count (counter)
// - user.signup.by_plan (grouped counter)

// Track response times
logger.info('API request completed', {
  event: 'api.request.complete',
  endpoint: req.route.path,
  method: req.method,
  duration: responseTime,
  status: res.statusCode
});
// This creates metrics:
// - api.request.duration (histogram)
// - api.request.count (counter)
// - api.request.by_status (grouped counter)

Debug performance issues with targeted queries:

# Find slow requests
level:info AND event:http.request.complete AND duration:>1000

# Identify slow database queries
level:debug AND event:database.query.complete AND duration:>500

# Check memory usage patterns
event:memory.usage AND heap_used:>100000000

Investigate errors:

# Find recent errors
level:error AND timestamp:>now-1h

# Trace specific error
trace_id:"550e8400-e29b-41d4-a716-446655440000"

# Find related errors
error:"Connection timeout" OR error:"ECONNRESET"

Track down user-specific issues:

# Track specific user activity
user_id:12345 AND timestamp:>now-24h

# Find authentication issues
event:user.login.failed AND user_id:12345

# Track user's request flow
trace_id:"user-trace-id" ORDER BY timestamp

Debugging best practices:

  1. Use structured logging for better searchability
  2. Include correlation IDs to trace requests
  3. Log at appropriate levels to avoid noise
  4. Include context in log messages
  5. Monitor log patterns for anomalies
  6. Set up alerts for critical issues
  7. Regularly review log retention policies
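
Including context (practice 4) is easiest with a child logger that stamps shared fields onto every entry so call sites never repeat them; Winston supports this natively via `logger.child({...})`. The dependency-free sketch below just illustrates the mechanism; `childLogger` is a hypothetical helper:

```javascript
// Sketch: wrap a base logger so shared context (trace ID, user ID, ...)
// is merged into every entry's fields automatically.
function childLogger(base, context) {
  return {
    info: (message, fields = {}) => base.info(message, { ...context, ...fields }),
    error: (message, fields = {}) => base.error(message, { ...context, ...fields }),
  };
}

// Usage: create one child per request, then log without repeating context
// const reqLog = childLogger(logger, { trace_id: req.traceId, user_id: req.user?.id });
// reqLog.info('Order created', { order_id: order.id });
```

Per-field overrides still win because call-site fields are spread last.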

Forward logs to external services:

Terminal window
# Splunk integration
curl -X POST \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "splunk",
    "host": "splunk.company.com",
    "port": 8088,
    "token": "splunk-hec-token",
    "index": "easydeploy"
  }' \
  https://api.easydeploy.com/v1/applications/APP_ID/log-forwarding

# ELK Stack integration
curl -X POST \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "type": "elasticsearch",
    "host": "elasticsearch.company.com",
    "port": 9200,
    "index": "easydeploy-logs",
    "username": "elastic",
    "password": "password"
  }' \
  https://api.easydeploy.com/v1/applications/APP_ID/log-forwarding

Integrate with Security Information and Event Management systems:

  • Splunk: Enterprise log analysis
  • ELK Stack: Elasticsearch, Logstash, Kibana
  • Graylog: Open source log management
  • Sumo Logic: Cloud-native log analytics

Need help with logging? Check our troubleshooting guide or contact support.