i was using a bunch of scripts like this last night for batch renames:
<?php
set_time_limit(3600);
require_once "inc_credentials.php";
require_once "inc_class_aws_s3.php";

$s3 = new S3($aws_access, $aws_secret);
$bucketName = 'lucas-photo';
///////////////////////////////////////
for ($i = 0; $i < 1000; $i++) {
    $sourceFile      = 'pictures/raw/' . str_pad($i, 8, '0', STR_PAD_LEFT) . '.jpg';
    $destinationFile = 'digital/raw/' . str_pad($i, 8, '0', STR_PAD_LEFT) . '.jpg';
    if ($s3->copyObject($bucketName, $sourceFile, $bucketName, $destinationFile, S3::ACL_PRIVATE)) {
        echo "Copied file $sourceFile to $destinationFile\n<br />\n";
    } else {
        echo "Failed to copy file $sourceFile to $destinationFile\n<br />\n";
    }
    flush();
}
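for what it's worth, the str_pad call is what builds the zero-padded object keys the loop copies; a quick check of the pattern:

```php
<?php
// The loop above builds zero-padded S3 object keys; for example:
$i = 42;
echo 'pictures/raw/' . str_pad($i, 8, '0', STR_PAD_LEFT) . ".jpg\n";
// prints "pictures/raw/00000042.jpg"
```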
inc_class_aws_s3.php is this; inc_credentials.php contains $aws_access and $aws_secret
i hope that's helpful!
i'm using this one now:
<?php
set_time_limit(3600);
require_once "inc_credentials.php";
require_once "inc_class_aws_s3.php";

$s3 = new S3($aws_access, $aws_secret);
$bucketName = 'lucas-photo';
///////////////////////////////////////
// Note the double quotes so \n is an actual newline, not a literal backslash-n.
if (($contents = $s3->getBucket($bucketName, 'pictures/albums/')) === false) die("Could not get bucket.\n");

foreach ($contents as $object) {
    $sourceFile      = $object['name'];
    $destinationFile = str_replace('pictures', 'digital', $object['name']);
    if ($s3->copyObject($bucketName, $sourceFile, $bucketName, $destinationFile, S3::ACL_PRIVATE)) {
        echo "Copied file $sourceFile to $destinationFile\n";
    } else {
        echo "Failed to copy file $sourceFile to $destinationFile\n";
    }
    flush();
}
object copy requests take forever with s3. even using their web management console. :(
bsdlite thinks darkness is his ally
it's probably time to write a procedural wrapper around that S3 class
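a rough sketch of what that wrapper could look like, assuming the same S3 class as above (the function names s3_copy / s3_ls / s3_rename_prefix are made up here, not part of the class):

```php
<?php
// Sketch of a procedural wrapper around the S3 class used above.
// Assumes inc_class_aws_s3.php has been loaded and $s3 holds a
// configured S3 instance; all function names here are hypothetical.

// Copy one object within a bucket, defaulting to a private ACL.
function s3_copy($s3, $bucket, $src, $dst, $acl = null) {
    if ($acl === null) {
        $acl = S3::ACL_PRIVATE;
    }
    return $s3->copyObject($bucket, $src, $bucket, $dst, $acl);
}

// List object key names under a prefix; returns an empty array on failure.
function s3_ls($s3, $bucket, $prefix = '') {
    $contents = $s3->getBucket($bucket, $prefix);
    return ($contents === false) ? array() : array_keys($contents);
}

// "Rename" every object under $from to the same path under $to
// (S3 has no real rename, so this is copy-per-object, as above).
// Returns the number of successful copies.
function s3_rename_prefix($s3, $bucket, $from, $to) {
    $renamed = 0;
    foreach (s3_ls($s3, $bucket, $from) as $key) {
        $dst = $to . substr($key, strlen($from));
        if (s3_copy($s3, $bucket, $key, $dst)) {
            $renamed++;
        }
    }
    return $renamed;
}
```

then the whole second script above collapses to something like `s3_rename_prefix($s3, 'lucas-photo', 'pictures/', 'digital/');` — though it still issues one copy request per object, so it won't be any faster.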