File System
VoidRun sandboxes come with a complete file system that persists throughout the sandbox lifecycle. Upload files, create directories, and manage data just like a regular Linux environment.
Ephemeral vs. Persistent Storage
Understanding Storage Behavior
Files in a sandbox persist as long as the sandbox exists. When you delete a sandbox, all files are permanently removed.
| State | File Persistence | Notes |
|-------|------------------|-------|
| Running | ✅ Full access | Read, write, modify files freely |
| Paused | ✅ Preserved | Files saved, sandbox inactive |
| Sleeping | ✅ Preserved | Files saved, auto-sleep mode |
| Stopped | ✅ Preserved | Files saved, sandbox halted |
| Deleted | ❌ Lost | All data permanently removed |
Storage Characteristics
- **📁 Ephemeral by Default**: Files exist only within the sandbox lifecycle. Delete the sandbox, lose the files.
- **💾 Large Storage**: Each sandbox includes generous disk space for code, dependencies, and data files.
- **⚡ Fast I/O**: Local SSD storage provides fast read/write operations for intensive workloads.
Best Practice: External Persistence
For critical data that needs to survive sandbox deletion:
- Download files before terminating sandboxes
- Use external storage (S3, GCS, databases) for persistent data
- Stream results to external services during execution
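The first point can be sketched as a small helper that downloads named files via the `downloadFile` method shown later on this page, writes them locally, and only then removes the sandbox. The `SandboxLike` interface and the `persistThenRemove` name are ours for illustration, not part of the SDK:

```typescript
import { writeFile } from 'fs/promises';

// Minimal structural type covering only the SDK methods this sketch needs
// (method names taken from the examples on this page).
interface SandboxLike {
  fs: { downloadFile(path: string): Promise<Buffer> };
  remove(): Promise<void>;
}

// Download each remote file into localDir, then remove the sandbox.
// The sandbox is removed only after every download has completed.
async function persistThenRemove(
  sandbox: SandboxLike,
  remotePaths: string[],
  localDir: string
): Promise<void> {
  for (const remote of remotePaths) {
    const content = await sandbox.fs.downloadFile(remote);
    const local = `${localDir}/${remote.split('/').pop()}`;
    await writeFile(local, content);
  }
  await sandbox.remove();
}
```

Because the parameter is typed structurally, the same function works with a real sandbox handle or a test stub.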
Uploading Files
Upload String Content
Write text content directly to a file:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.createSandbox({ cpu: 2, mem: 2048 });

// Upload string content to a file
await sandbox.fs.uploadFile('/app/config.json', JSON.stringify({
  name: 'VoidRun',
  version: '1.0.0',
  settings: { debug: true }
}));

console.log('File uploaded successfully');

await sandbox.remove();
```
Upload Binary Files
Upload binary data (images, archives, etc.):
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Upload from a local file path (Node.js, Deno, Bun)
await sandbox.fs.uploadFileFromPath(
  '/remote/images/logo.png',
  './local/logo.png',
  'image/png'
);

// Or upload from a Uint8Array
const binaryData = new Uint8Array([0x89, 0x50, 0x4E, 0x47]); // PNG magic bytes
await sandbox.fs.uploadFileFromSource('/remote/data.bin', binaryData);

// Or upload from a Blob
const blob = new Blob(['text content'], { type: 'text/plain' });
await sandbox.fs.uploadFileFromSource('/remote/file.txt', blob);
```
Streaming Upload
For large files, use streaming to avoid memory issues:
```typescript
import { VoidRun } from '@voidrun/sdk';
import * as fs from 'fs';
import { Readable } from 'stream';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Stream a large file from the local filesystem
await sandbox.fs.uploadFileFromPath(
  '/remote/large-dataset.tar.gz',
  './local/large-dataset.tar.gz'
);

// Or use a ReadableStream directly
const fileStream = fs.createReadStream('./large-file.bin');
const webStream = Readable.toWeb(fileStream);

await sandbox.fs.uploadFileStream(
  '/remote/large-file.bin',
  webStream,
  'application/octet-stream'
);
```
Downloading Files
Download File Content
Retrieve file contents from the sandbox:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Download file as a Buffer
const content = await sandbox.fs.downloadFile('/app/output.json');
console.log('File content:', content.toString('utf-8'));

// Parse JSON
const data = JSON.parse(content.toString('utf-8'));
console.log('Parsed data:', data);
```
Streaming Download
For large files, use streaming to process data incrementally:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Get a streaming ReadableStream
const stream = await sandbox.fs.downloadFileStream('/remote/large-log.txt');

// Process stream chunks
const reader = stream.getReader();
const chunks: Uint8Array[] = [];

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  chunks.push(value);
  console.log(`Received chunk of ${value.length} bytes`);
}

// Combine chunks
const totalLength = chunks.reduce((sum, chunk) => sum + chunk.length, 0);
const result = new Uint8Array(totalLength);
let offset = 0;
for (const chunk of chunks) {
  result.set(chunk, offset);
  offset += chunk.length;
}

console.log('Total bytes:', result.length);
```
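The read-and-combine loop above can be wrapped in a reusable helper. This sketch depends only on the standard Web Streams API (the `streamToUint8Array` name is ours, not the SDK's):

```typescript
// Drain a ReadableStream of Uint8Array chunks into one contiguous buffer.
async function streamToUint8Array(
  stream: ReadableStream<Uint8Array>
): Promise<Uint8Array> {
  const reader = stream.getReader();
  const chunks: Uint8Array[] = [];
  let total = 0;

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    chunks.push(value);
    total += value.length;
  }

  // Copy each chunk into its slot in the combined buffer.
  const out = new Uint8Array(total);
  let offset = 0;
  for (const chunk of chunks) {
    out.set(chunk, offset);
    offset += chunk.length;
  }
  return out;
}
```

It works with any `ReadableStream<Uint8Array>`, including the stream returned by `downloadFileStream`. In Node.js, collecting chunks and calling `Buffer.concat` is an equivalent alternative.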
File Operations
List Files
List files in a directory:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// List files in a directory
const result = await sandbox.fs.listFiles('/app');
console.log('Files:', result.data);
// Output: [{ name: 'config.json', isDir: false, size: 123 }, ...]
```
Create Files and Directories
Create empty files and directories:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Create an empty file
await sandbox.fs.createFile('/app/newfile.txt');

// Create a directory (and any missing parent directories)
await sandbox.fs.createDirectory('/app/data/logs');
```
Copy and Move Files
Copy or move files within the sandbox:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Copy a file
await sandbox.fs.copyFile('/app/source.txt', '/app/destination.txt');

// Move/rename a file
await sandbox.fs.moveFile('/app/old-name.txt', '/app/new-name.txt');
```
Delete Files
Remove files and directories:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Delete a file
await sandbox.fs.deleteFile('/app/obsolete.txt');
```
Get File Info
Retrieve file metadata and statistics:

```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Get file stats
const result = await sandbox.fs.statFile('/app/data.json');
console.log('File info:', result.info);
// { size: 1024, mode: '0644', mtime: '2024-01-15T10:30:00Z', isDir: false }
```
Search Files
Find files matching a pattern:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Search for Python files
const result = await sandbox.fs.searchFiles('/app', '*.py');
console.log('Found files:', result.paths);
// ['/app/main.py', '/app/utils.py', '/app/tests/test_main.py']
```
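Patterns like `*.py` follow shell-style globbing: `*` matches any run of characters within a name and `?` matches a single character. As a rough illustration of those semantics (matching happens inside the sandbox; this helper is not part of the SDK):

```typescript
// Sketch of shell-style glob matching against a single file name.
function globMatch(pattern: string, name: string): boolean {
  const regex =
    '^' +
    pattern
      .replace(/[.+^${}()|[\]\\]/g, '\\$&') // escape regex metacharacters
      .replace(/\*/g, '[^/]*') // '*' matches any run of non-separator chars
      .replace(/\?/g, '[^/]') + // '?' matches exactly one character
    '$';
  return new RegExp(regex).test(name);
}
```

So `globMatch('*.py', 'main.py')` is true, while `'main.pyc'` does not match.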
Archive Operations
Compress Files
Create archives of files and directories:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Compress directory to tar.gz
const result = await sandbox.fs.compressFile('/app/data', 'tar.gz');
console.log('Archive created:', result.archivePath);
// /app/data.tar.gz

// Also supports: 'tar', 'tar.gz', 'tar.bz2', 'zip'
```
Extract Archives
Extract compressed archives:

```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Extract archive into a target directory
await sandbox.fs.extractArchive(
  '/app/data.tar.gz',
  '/app/extracted'
);
```
File Watching
Monitor file changes in real-time:
```typescript
import { VoidRun } from '@voidrun/sdk';

const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });
const sandbox = await vr.getSandbox('sandbox-id');

// Watch for file changes
const watcher = await sandbox.fs.watch('/app', {
  recursive: true,
  onEvent: (event) => {
    console.log(`File ${event.type}: ${event.path}`);
    // event.type: 'create', 'modify', 'delete'
  },
  onError: (error) => {
    console.error('Watch error:', error);
  }
});

// Later, stop watching
watcher.close();
```
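Editors and build tools often emit bursts of `modify` events for a single logical change. A per-path debounce can coalesce such bursts before your handler runs; this is an illustrative wrapper (the `debouncePerPath` name is ours, not an SDK feature) that could be passed as the `onEvent` option above:

```typescript
type WatchEvent = { type: string; path: string };

// Wrap a handler so rapid events for the same path collapse into one call,
// fired waitMs after the last event in the burst.
function debouncePerPath(
  handler: (event: WatchEvent) => void,
  waitMs: number
): (event: WatchEvent) => void {
  const timers = new Map<string, ReturnType<typeof setTimeout>>();
  return (event) => {
    const pending = timers.get(event.path);
    if (pending) clearTimeout(pending); // restart the window for this path
    timers.set(event.path, setTimeout(() => {
      timers.delete(event.path);
      handler(event);
    }, waitMs));
  };
}
```

Usage would look like `onEvent: debouncePerPath((event) => { /* ... */ }, 200)`. Events for different paths debounce independently.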
Complete Example: Data Processing Pipeline
Here’s a complete example showing file operations in a data processing workflow:
```typescript
import { VoidRun } from '@voidrun/sdk';

async function processData() {
  const vr = new VoidRun({ apiKey: process.env.VOIDRUN_API_KEY });

  // Create sandbox
  const sandbox = await vr.createSandbox({
    name: 'data-processor',
    cpu: 2,
    mem: 4096
  });

  try {
    // 1. Upload input data
    const inputData = JSON.stringify({
      records: [
        { id: 1, value: 100 },
        { id: 2, value: 200 },
        { id: 3, value: 300 }
      ]
    });
    await sandbox.fs.uploadFile('/data/input.json', inputData);
    console.log('Input data uploaded');

    // 2. Create processing script
    const script = `
import json

# Read input
with open('/data/input.json', 'r') as f:
    data = json.load(f)

# Process data
total = sum(r['value'] for r in data['records'])
average = total / len(data['records'])

# Write output
output = {
    'total': total,
    'average': average,
    'count': len(data['records'])
}
with open('/data/output.json', 'w') as f:
    json.dump(output, f, indent=2)

print(f"Processed {output['count']} records")
`;
    await sandbox.fs.uploadFile('/scripts/process.py', script);

    // 3. Run processing
    const result = await sandbox.exec({
      command: 'python3 /scripts/process.py'
    });
    console.log('Processing:', result.data?.stdout);

    // 4. Download results
    const outputData = await sandbox.fs.downloadFile('/data/output.json');
    const results = JSON.parse(outputData.toString());
    console.log('Results:', results);

    // 5. Archive processed data
    await sandbox.fs.compressFile('/data', 'tar.gz');
    const archive = await sandbox.fs.downloadFile('/data.tar.gz');
    console.log('Archive size:', archive.length, 'bytes');

    return results;
  } finally {
    await sandbox.remove();
  }
}

processData().catch(console.error);
```
Next Steps
- **Custom Images**: Create custom Docker images with pre-configured environments.
- **Package Management**: Install and manage packages dynamically in sandboxes.