Filter: wpd_ai_analytics_data_fetch_batch_size
Controls the batch size used when fetching analytics data from the database, letting you tune query performance and memory use for large datasets.
Description
When fetching analytics data for reports, Alpha Insights processes records in batches to avoid memory exhaustion and optimize database performance. This filter allows you to customize the batch size based on your server capabilities and data volume.
Location
File: includes/classes/WPD_Data_Warehouse_React.php
Method: WPD_Data_Warehouse_React::get_analytics_data()
Line: ~6611
Parameters
| Parameter | Type | Description |
|---|---|---|
| $batch_size | int | The number of records to fetch per batch (default: 10,000) |
Return
Type: int
Description: Modified batch size (must be a positive integer)
Example Usage
Increase Batch Size for Better Performance
add_filter( 'wpd_ai_analytics_data_fetch_batch_size', 'increase_analytics_batch_size' );
function increase_analytics_batch_size( $batch_size ) {
    // Increase to 25,000 for high-performance servers.
    return 25000;
}
Reduce Batch Size for Memory-Constrained Servers
add_filter( 'wpd_ai_analytics_data_fetch_batch_size', 'reduce_analytics_batch_size' );
function reduce_analytics_batch_size( $batch_size ) {
    // Reduce to 5,000 for servers with limited memory.
    return 5000;
}
Dynamic Batch Size Based on Data Volume
add_filter( 'wpd_ai_analytics_data_fetch_batch_size', 'dynamic_analytics_batch_size' );
function dynamic_analytics_batch_size( $batch_size ) {
    global $wpdb;

    // Get total analytics records count.
    $total_records = $wpdb->get_var(
        "SELECT COUNT(*) FROM {$wpdb->prefix}wpd_ai_events"
    );

    // Adjust batch size based on total records.
    if ( $total_records > 1000000 ) {
        // Very large dataset: use smaller batches.
        return 5000;
    } elseif ( $total_records > 100000 ) {
        // Large dataset: use medium batches.
        return 10000;
    } else {
        // Small dataset: use larger batches.
        return 20000;
    }
}
Batch Size Based on Available Memory
add_filter( 'wpd_ai_analytics_data_fetch_batch_size', 'memory_based_analytics_batch_size' );
function memory_based_analytics_batch_size( $batch_size ) {
    // Work out how much memory is still available to PHP.
    $memory_limit       = ini_get( 'memory_limit' );
    $memory_limit_bytes = wp_convert_hr_to_bytes( $memory_limit );
    $memory_usage       = memory_get_usage( true );
    $available_memory   = $memory_limit_bytes - $memory_usage;

    // Adjust batch size based on available memory.
    // The thresholds below are illustrative; tune them for your server.
    if ( $available_memory < 64 * MB_IN_BYTES ) {
        // Tight on memory: use smaller batches.
        return 5000;
    } elseif ( $available_memory < 256 * MB_IN_BYTES ) {
        // Moderate headroom: keep the default.
        return 10000;
    }

    // Plenty of headroom: use larger batches.
    return 20000;
}
Best Practices
- Default of 10,000 works well for most stores
- Monitor memory usage when increasing batch size
- Use smaller batches on shared hosting environments
- Larger batches = fewer queries but more memory usage
- Smaller batches = more queries but less memory usage
- Test with your actual data volume before production
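As a minimal sketch of the last point, you could keep batches conservative everywhere except production. This assumes WordPress 5.5+ for wp_get_environment_type(); the function name env_based_analytics_batch_size is just an example:

```php
add_filter( 'wpd_ai_analytics_data_fetch_batch_size', 'env_based_analytics_batch_size' );
function env_based_analytics_batch_size( $batch_size ) {
    // Use a small, safe batch size on staging/development sites.
    if ( 'production' !== wp_get_environment_type() ) {
        return 5000;
    }
    // In production, leave the filtered value unchanged.
    return $batch_size;
}
```

This lets you validate report output on staging without risking memory pressure, then rely on the tuned value in production.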
Performance Impact
| Batch Size | Memory Usage | Query Count | Best For |
|---|---|---|---|
| 5,000 | Low | High | Shared hosting, low memory, very large datasets |
| 10,000 (default) | Medium | Medium | Most standard setups |
| 20,000-25,000 | High | Low | Dedicated servers, high memory, medium datasets |
Important Notes
- Batch size must be a positive integer
- Very large batch sizes may cause memory exhaustion
- Very small batch sizes may slow down queries
- Default value is optimized for most use cases
- Only adjust if you’re experiencing performance issues
- This affects analytics data fetching, not session data
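Because the filter requires a positive integer, a late-priority callback can clamp whatever earlier callbacks return. This is a defensive sketch, not part of the plugin itself; absint() is a standard WordPress helper, and sanitize_analytics_batch_size is a hypothetical name:

```php
add_filter( 'wpd_ai_analytics_data_fetch_batch_size', 'sanitize_analytics_batch_size', 999 );
function sanitize_analytics_batch_size( $batch_size ) {
    // Force a non-negative integer, then fall back to the default on bad input.
    $batch_size = absint( $batch_size );
    return $batch_size > 0 ? $batch_size : 10000;
}
```

Running at priority 999 ensures the clamp sees the final value after all other filters have applied.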
Debugging
add_filter( 'wpd_ai_analytics_data_fetch_batch_size', 'debug_analytics_batch_size', 999 );
function debug_analytics_batch_size( $batch_size ) {
    error_log( 'Alpha Insights Analytics Batch Size: ' . $batch_size );
    error_log( 'Memory Usage: ' . size_format( memory_get_usage( true ) ) );
    return $batch_size;
}
Related Filters
- wpd_ai_session_data_chunk_size – Control session data chunk size
- wpd_ai_report_filters_batch_size – Control report filters batch size
Related Classes
WPD_Data_Warehouse_React – React dashboard data warehouse