How do you handle large file uploads efficiently in Laravel? #57271
-
Hey folks 👋 I’m working on an app that needs to handle file uploads larger than 100MB. Would love to hear how you approach this in your projects! Thanks 🙏
Replies: 1 comment
-
Hello! Great question - handling large file uploads is a common challenge we all face in Laravel projects. Let me share how I typically approach this:

**Chunked Uploads Strategy**

For files over 100MB, chunked uploads are definitely the way to go. I usually use Resumable.js or Dropzone.js on the frontend:

```js
// Example with Dropzone
Dropzone.options.myDropzone = {
    chunking: true,
    chunkSize: 10 * 1024 * 1024, // 10MB chunks
    parallelChunkUploads: true,
    retryChunks: true
};
```

Backend handling with Laravel:

```php
public function handleChunk(Request $request)
{
    $file = $request->file('file');
    $chunkIndex = $request->input('chunkIndex');
    $totalChunks = $request->input('totalChunks');
    $identifier = $request->input('identifier');

    $path = storage_path("app/uploads/chunks/{$identifier}");

    // Store chunk
    $file->move($path, "chunk_{$chunkIndex}");

    // If all chunks uploaded, combine them
    if ($chunkIndex == $totalChunks - 1) {
        return $this->combineChunks($identifier, $request->input('originalName'));
    }

    return response()->json(['progress' => ($chunkIndex + 1) / $totalChunks]);
}
```

**Configuration & Performance**

**Increase PHP Limits**

Note that `upload_max_filesize` and `post_max_size` can't be raised with `ini_set()` from a controller or middleware, because PHP parses the uploaded body before your code ever runs. Set them in `php.ini` (or a `.user.ini` / FPM pool config) instead:

```ini
; php.ini
upload_max_filesize = 500M
post_max_size = 500M
max_execution_time = 300
```

**Nginx Configuration**

```nginx
client_max_body_size 500M;
client_body_timeout 300s;
proxy_read_timeout 300s;
```

**Queue Processing for Heavy Operations**

Always process file validation, resizing, or scanning in queues:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

// Job for processing uploaded file
class ProcessLargeFile implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public $filePath) {}

    public function handle()
    {
        // Scan for viruses, generate thumbnails, etc.
        // (VirusScanner stands in for whatever scanning service you actually use)
        $scanner = new VirusScanner();
        $result = $scanner->scan($this->filePath);

        if ($result->isClean()) {
            // Move to final location
            Storage::disk('s3')->put(
                'uploads/' . basename($this->filePath),
                fopen($this->filePath, 'r') // a read-only stream is enough here
            );

            // Clean up temporary file
            unlink($this->filePath);
        }
    }
}
```
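Since scanning and moving a multi-hundred-MB file can easily outrun the default queue timeout, it may also be worth tuning the job's timeout and retry behaviour. A minimal sketch using Laravel's standard job properties (the values are just examples, adjust to your workload):

```php
class ProcessLargeFile implements ShouldQueue
{
    // ...

    public $timeout = 3600; // give the worker up to an hour per file
    public $tries = 3;      // retry transient failures (network, S3 throttling, ...)
    public $backoff = 60;   // wait a minute between attempts
}
```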
```php
// In your controller
public function combineChunks($identifier, $filename)
{
    // ... combine logic ...

    ProcessLargeFile::dispatch($finalPath)->onQueue('file-processing');
}
```

**Storage Strategy**

**S3 with Multipart Uploads**

```php
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Storage;

public function uploadToS3($filePath)
{
    $uploader = new MultipartUploader(
        Storage::disk('s3')->getClient(), // available on the S3 driver in Laravel 9+
        $filePath,
        [
            'bucket' => config('filesystems.disks.s3.bucket'),
            'key' => 'uploads/' . basename($filePath),
            'before_initiate' => function ($command) {
                $command['ACL'] = 'private';
            },
        ]
    );

    try {
        $result = $uploader->upload();
        return $result['ObjectURL'];
    } catch (MultipartUploadException $e) {
        // Handle upload failure
        Log::error('S3 upload failed: ' . $e->getMessage());
    }
}
```
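If you extract that multipart helper into its own class, the queued job from earlier can use it instead of the plain `Storage::put()` call. A sketch, assuming a hypothetical `S3UploadService` that exposes `uploadToS3()` (Laravel resolves `handle()` parameters out of the container):

```php
// Inside ProcessLargeFile - the service is injected into handle()
public function handle(S3UploadService $uploads)
{
    $url = $uploads->uploadToS3($this->filePath);

    if ($url !== null) {
        // Only delete the temp file once the multipart upload succeeded
        unlink($this->filePath);
    }
}
```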
**Middleware for Large Uploads**

Create custom middleware:

```php
<?php

namespace App\Http\Middleware;

use Closure;

class IncreaseUploadLimits
{
    public function handle($request, Closure $next)
    {
        if ($request->is('upload/*')) {
            // Only runtime-changeable directives can be raised here;
            // upload_max_filesize and post_max_size must live in php.ini / the server config.
            ini_set('max_execution_time', '300');
            ini_set('memory_limit', '512M');
        }

        return $next($request);
    }
}
```
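One thing to remember: the middleware still has to be registered before Laravel will run it. On a Laravel 10-style app that's a route-middleware alias in `app/Http/Kernel.php` (Laravel 11+ does the equivalent in `bootstrap/app.php`); the `upload.limits` alias below is just my own naming:

```php
// app/Http/Kernel.php (Laravel 10.x)
protected $middlewareAliases = [
    // ...
    'upload.limits' => \App\Http\Middleware\IncreaseUploadLimits::class,
];
```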
**Recommended Packages**

Here are some battle-tested packages:

```bash
# For chunked uploads
composer require pion/laravel-chunk-upload

# For S3 multipart
composer require league/flysystem-aws-s3-v3

# For activity/audit logging of uploads
composer require spatie/laravel-activitylog
```
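If you'd rather not maintain the chunk-assembly code yourself, pion/laravel-chunk-upload wraps most of it. Roughly, based on the package's README (double-check the docs for the handler that matches your JS uploader; `saveFullFile()` is a hypothetical helper of your own):

```php
use Illuminate\Http\Request;
use Pion\Laravel\ChunkUpload\Exceptions\UploadMissingFileException;
use Pion\Laravel\ChunkUpload\Handler\HandlerFactory;
use Pion\Laravel\ChunkUpload\Receiver\FileReceiver;

public function upload(Request $request)
{
    // Pick the right handler based on the JS uploader's request format
    $receiver = new FileReceiver('file', $request, HandlerFactory::classFromRequest($request));

    if ($receiver->isUploaded() === false) {
        throw new UploadMissingFileException();
    }

    $save = $receiver->receive();

    if ($save->isFinished()) {
        // All chunks received - getFile() returns a normal UploadedFile
        return $this->saveFullFile($save->getFile());
    }

    // Still receiving chunks - report progress back to the uploader
    $handler = $save->handler();

    return response()->json(['done' => $handler->getPercentageDone()]);
}
```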
**Complete Example Controller**

```php
<?php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use App\Jobs\ProcessLargeFile;
use Illuminate\Support\Facades\Storage;

class FileUploadController extends Controller
{
    public function uploadChunk(Request $request)
    {
        $request->validate([
            'file' => 'required|file',
            'chunkIndex' => 'required|integer',
            'totalChunks' => 'required|integer',
            'identifier' => 'required|string',
            'originalName' => 'required|string', // used once the last chunk arrives
        ]);

        $chunk = $request->file('file');
        $chunkIndex = $request->chunkIndex;
        $identifier = $request->identifier;

        $chunkPath = "chunks/{$identifier}/chunk_{$chunkIndex}";
        Storage::put($chunkPath, file_get_contents($chunk->getRealPath()));

        if ($chunkIndex == $request->totalChunks - 1) {
            $finalPath = $this->combineChunks($identifier, $request->originalName);
            ProcessLargeFile::dispatch($finalPath);

            return response()->json(['status' => 'complete']);
        }

        return response()->json(['status' => 'chunk_uploaded']);
    }
    private function combineChunks($identifier, $filename)
    {
        // Both paths assume the default 'local' disk root of storage/app
        $finalPath = storage_path("app/uploads/{$filename}");
        $chunkPath = storage_path("app/chunks/{$identifier}");

        // Make sure the destination directory exists
        if (! is_dir(dirname($finalPath))) {
            mkdir(dirname($finalPath), 0755, true);
        }

        // Count the chunks once up front - the glob() result shrinks as chunks are deleted
        $totalChunks = count(glob("{$chunkPath}/chunk_*"));

        $file = fopen($finalPath, 'wb');

        for ($i = 0; $i < $totalChunks; $i++) {
            $chunkFile = "{$chunkPath}/chunk_{$i}";
            fwrite($file, file_get_contents($chunkFile));
            unlink($chunkFile);
        }

        fclose($file);
        rmdir($chunkPath);

        return $finalPath;
    }
}
```
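To wire the controller up, the routes could look roughly like this (the path and the `upload.limits` alias from the middleware section are my own naming, adjust to your app):

```php
// routes/web.php (or routes/api.php)
use App\Http\Controllers\FileUploadController;
use Illuminate\Support\Facades\Route;

Route::post('/upload/chunk', [FileUploadController::class, 'uploadChunk'])
    ->middleware('upload.limits');
```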
**Key Takeaways:**

- Chunk large files on the frontend (Dropzone.js / Resumable.js) and reassemble them on the server
- Raise the PHP and Nginx limits (`php.ini`, `client_max_body_size`) to match your upload sizes
- Push virus scanning, thumbnail generation, and moving files into queued jobs
- Use S3 multipart uploads for the final storage step

Hope this helps! Would love to hear how others in the community are handling this too. Does anyone have a different approach? 🚀