Provides a general boilerplate for chunked batch processing with the Drupal 7 Batch API. @see http://drupal.org/node/180528 for a more detailed explanation of batches.
```php
<?php

function my_batch_name() {
  $chunk_size = 120; // Items to process per run.

  // Get the data array.
  $data = get_data();

  // Batch init.
  $batch = array(
    'title' => t('My title ...'), // Change to the batch title.
    'operations' => array(),
    'init_message' => t('Commencing'),
    'progress_message' => t('Processed @current out of @total.'),
    //'file' => drupal_get_path('module', 'mymodule') . '/includes/my_batch_name.inc',
    'error_message' => t('An error occurred during processing'),
    'finished' => 'batch_finished',
    'progressive' => FALSE,
  );

  if ($data) {
    $chunks = array_chunk($data, $chunk_size);
    $count_chunks = count($chunks);
    $i = 1;
    foreach ($chunks as $chunk) {
      $batch['operations'][] = array(
        'process_chunk',
        array(
          $chunk,
          t('(Importing chunk @chunk of @count)', array('@chunk' => $i, '@count' => $count_chunks)),
        ),
      );
      //if ($i == 1) break; // Debugging: stop after the first chunk.
      $i++;
    }
  }

  batch_set($batch);
  batch_process('<front>'); // Change to the desired batch finish page.
}

function get_data() {
  $query = new EntityFieldQuery();
  // Build the query / get the data here.
  $results = $query->execute();
  if (isset($results['node'])) {
    return array_keys($results['node']);
  }
  return array();
}

// Batch operation callback: process one chunk.
function process_chunk($data_chunk, $operation_details, &$context) {
  foreach ($data_chunk as $data) {
    // Do the processing, e.g.:
    // $node = node_load($data);
    // node_save($node);
  }
  $context['message'] = $operation_details; // Shows which chunk we're on.
}

// Batch 'finished' callback.
function batch_finished($success, $results, $operations) {
  if ($success) {
    $message = 'Success.'; // Customize success message.
  }
  else {
    $message = 'There were errors in the operations. Could not complete the process.';
  }
  drupal_set_message($message);
}
```
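The snippet calls `batch_process()` directly, which is the pattern for triggering a batch outside of a form submission. One common way to wire that up is a menu callback; a minimal sketch, assuming a hypothetical module name `mymodule` and path `admin/run-my-batch`:

```php
/**
 * Implements hook_menu().
 *
 * The path, title, and permission here are placeholders; adjust
 * them to your module. Visiting the path runs my_batch_name(),
 * which sets up and starts the batch.
 */
function mymodule_menu() {
  $items['admin/run-my-batch'] = array(
    'title' => 'Run my batch',
    'page callback' => 'my_batch_name',
    'access arguments' => array('administer site configuration'),
  );
  return $items;
}
```

If the batch is started from a form submit handler instead, calling `batch_set($batch)` alone is enough; the Form API calls `batch_process()` for you, and the `batch_process('<front>')` line should be dropped.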
Comments
dane commented 4 years 8 months ago
Thanks!
sbilde commented 4 years 7 months ago
Would it be possible to fire off multiple batch operations?
Scenario:
node add: Facebook Report (content type)
field: field_facebook_id
Multiple requests (data):
http://graph.facebook.com/Page_ID/insights
http://graph.facebook.com/Page_ID/posts
Those should be processed and saved either as entities or as rows directly in the database (through the Data module).
Any ideas on how this is done?
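One way to approach this, following the boilerplate above: a batch's `operations` array can hold any number of operations, so each endpoint can become its own operation. A minimal sketch, assuming a hypothetical operation callback `facebook_report_fetch_and_save()` that does the actual fetching and saving:

```php
/**
 * Sketch: one batch operation per Facebook endpoint.
 * The URLs mirror the ones in the question; the callback name
 * and this function's name are placeholders.
 */
function facebook_report_batch($page_id) {
  $endpoints = array(
    'http://graph.facebook.com/' . $page_id . '/insights',
    'http://graph.facebook.com/' . $page_id . '/posts',
  );
  $batch = array(
    'title' => t('Importing Facebook data'),
    'operations' => array(),
    'finished' => 'batch_finished',
  );
  foreach ($endpoints as $url) {
    // Operations run sequentially; each gets its own HTTP request
    // and can save its results as entities or Data-module rows.
    $batch['operations'][] = array('facebook_report_fetch_and_save', array($url));
  }
  batch_set($batch);
  batch_process('<front>');
}
```

Each operation could also be combined with the chunking pattern above if a single endpoint returns more rows than can be saved in one request.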