
Zero-Touch PSP Development: From TCP Commands to Physical Actuators

The PSP has no JTAG, no serial console, and no remote debug interface. We built a complete autonomous development loop — WiFi auto-connect, a TCP command server with remote input injection, live framebuffer streaming, arbitrary file upload, and a DIY linear actuator for hard reboot recovery. The result: a closed loop where an AI agent debugs H.264 video decode on real hardware, iterating dozens of times without anyone touching the device.

The Problem: Manual Testing Doesn’t Scale

Developing for the PSP is an exercise in physical logistics. Every test cycle requires walking to the device, plugging in USB, copying the EBOOT, ejecting, launching the app, connecting to WiFi through a dialog, reproducing the bug, reading logs via USB again, and repeating. Each iteration takes 3–5 minutes of human time, most of it waiting. When you’re chasing a firmware deadlock that manifests after 70 decoded video frames (Entry 07), that adds up fast.

We needed to close the loop — remove the human from the iterative cycle entirely. Build the EBOOT on the host, deploy it to the PSP over WiFi, reboot the device, wait for it to come back online, navigate the UI remotely, capture the screen, and read logs — all from a script or an AI agent. No walking to the desk. No plugging in cables. No pressing buttons.

Layer 1: WiFi Auto-Connect

The PSP’s WiFi stack doesn’t connect automatically on boot. Normally, psp::net::connect_dialog() shows a system dialog that renders via the GU — which blocks the main thread and corrupts the display list if called at the wrong time.

We added a background thread (cmd_srv, priority 40) that spawns during EBOOT init. It waits 5 seconds for the WLAN hardware to initialize after a cold boot, then calls sceNetApctlConnect on saved WiFi profiles (trying profile 1 then 0) without any dialog. If the first attempt fails, it retries once more after the main loop has had time to settle.

Key discovery: psp::net::is_connected() uses an internal flag that doesn’t reflect the actual apctl state. We had to check sceNetApctlGetState directly to get reliable connection status. The is_connected() flag is only set by psp::net::init(), not by raw sceNetApctlConnect calls. This same mismatch caused a GU corruption bug when the TV Guide app launched: it checked is_connected(), got false (despite WiFi being up), called ensure_net_init() which returned immediately, then called reinit_gu_frame() — a function that is only safe after GU utility dialogs. The fix: check sceNetApctlGetState everywhere, not just in the server thread.

Layer 2: TCP Command Server

Once WiFi is up, the background thread starts a TCP server on port 9293. The protocol is intentionally simple: one text command per connection, one response, close. This makes it trivially scriptable with nc (netcat):

$ echo "ping" | nc -w 3 192.168.0.249 9293
pong

The full command set:

Command               Response     Description
ping                  pong\n       Connectivity check
status                JSON         Kiosk app, free memory, frame count, audio-only flag
log                   text         Last 2KB of eboot.log
logfull               text         Last 8KB of eboot.log
screencap             raw pixels   480×272 ABGR framebuffer stream
screenshot            ok\n         Save VRAM to ms0:
press <button>        ok\n         Inject button press+release
hold <button> <ms>    ok\n         Hold button for N milliseconds
cursor <x> <y>        ok\n         Move cursor to absolute position
deploy <size>         ok\n         Receive EBOOT binary over TCP
upload <size> <path>  ok\n         Write any file to ms0: over TCP
reboot                ok\n         Cold hardware reset via scePowerRequestColdReset
exit                  ok\n         Exit to XMB
audio-only [on|off]   status       Toggle or set video decode bypass
video-limit <N>       status       Set max video frames for >480p content

Layer 3: Remote Deploy + Arbitrary File Upload

The deploy command receives a raw EBOOT binary over the TCP connection. The protocol is minimal: send deploy <size>\n followed by exactly <size> bytes of EBOOT data. The server writes to a temp file first, then renames over the live EBOOT (atomic-ish on FAT32). This avoids corruption if the transfer is interrupted.
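The framing described above (a `deploy <size>` or `upload <size> <path>` header line, then exactly `<size>` raw bytes) can be sketched in a few lines of Python. The function names here are illustrative:

```python
import pathlib
import socket

def deploy_eboot(host: str, path: str, port: int = 9293, timeout: float = 30.0) -> bytes:
    """Send the `deploy <size>` header line, then exactly <size> bytes, then read the reply."""
    data = pathlib.Path(path).read_bytes()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(f"deploy {len(data)}\n".encode())
        sock.sendall(data)
        return sock.recv(16)  # expect b"ok\n" on success

def upload_file(host: str, local: str, remote: str, port: int = 9293, timeout: float = 30.0) -> bytes:
    """Same framing as deploy, but with a destination path on the Memory Stick."""
    data = pathlib.Path(local).read_bytes()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(f"upload {len(data)} {remote}\n".encode())
        sock.sendall(data)
        return sock.recv(16)
```

The explicit size in the header is what lets the server know when the payload is complete without relying on the client closing the connection.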

The upload command generalizes this to any file path on the Memory Stick. This is critical for updating kernel PRX plugins without USB access:

# Deploy a new EBOOT
$ SIZE=$(stat -c%s "EBOOT.PBP")
$ (echo "deploy $SIZE"; cat "EBOOT.PBP") | nc -w 30 192.168.0.249 9293
ok

# Upload a kernel PRX plugin update
$ SIZE=$(stat -c%s "oasis.prx")
$ (echo "upload $SIZE ms0:/seplugins/oasis.prx"; cat "oasis.prx") | nc -w 30 192.168.0.249 9293
ok

After deploy, reboot triggers scePowerRequestColdReset(0) — a full hardware reboot that reloads all firmware modules, CFW plugins, and the AutoStart EBOOT. This is not sceKernelLoadExec (which only restarts the app) — it’s a real power cycle. The PSP shuts down, reboots, CFW initializes, AutoStart launches OASIS OS, WiFi auto-connects, and the TCP server comes back online. Total time: ~20 seconds.

$ echo "reboot" | nc -w 3 192.168.0.249 9293
ok
$ sleep 20
$ echo "ping" | nc -w 3 192.168.0.249 9293
pong
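A fixed `sleep 20` works, but a polling loop is more robust when boot time varies. A sketch (the function name is illustrative):

```python
import socket
import time

def wait_for_online(host: str, port: int = 9293, timeout: float = 60.0, interval: float = 2.0) -> bool:
    """Poll the command server with `ping` until it answers or the deadline passes."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=3.0) as sock:
                sock.sendall(b"ping\n")
                if sock.recv(16).startswith(b"pong"):
                    return True
        except OSError:
            pass  # connection refused or timed out: PSP still booting
        time.sleep(interval)
    return False
```

Connection-refused errors during boot are expected and swallowed; only a real `pong` counts as online.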

Layer 4: Remote UI Control

Button presses are injected through a lock-free SpscQueue<InputEvent, 32> shared between the TCP server thread and the main loop. Each frame, poll_events_inner() drains the queue alongside real controller input. From the main loop’s perspective, injected events are indistinguishable from physical button presses.

The cursor command sets absolute cursor position, which is critical for the PSP’s cursor-based dashboard — icon clicks use hit-testing at the cursor position, not d-pad grid navigation. Injected CursorMove events update the backend’s internal cursor_x/cursor_y so that subsequent ButtonPress(Confirm) events hit-test at the right coordinates.

With this, we can script complete UI workflows:

# Navigate to TV Guide (icon at row 2, col 2)
echo "cursor 290 200" | nc -w 3 192.168.0.249 9293
sleep 0.3
echo "press cross" | nc -w 3 192.168.0.249 9293

# Wait for it to load, then toggle windowed mode
sleep 5
echo "press start" | nc -w 3 192.168.0.249 9293
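The same cursor-then-click workflow translates directly to Python. A sketch, with a settle delay so the main loop has a few frames to drain the injected CursorMove before the press hit-tests (`cmd` and `click_at` are illustrative names):

```python
import socket
import time

def cmd(host: str, text: str, port: int = 9293) -> bytes:
    """One command per connection, matching the PSP server's protocol."""
    with socket.create_connection((host, port), timeout=3.0) as sock:
        sock.sendall(text.encode() + b"\n")
        return sock.recv(64)

def click_at(host: str, x: int, y: int, port: int = 9293, settle: float = 0.3) -> bytes:
    """Move the cursor, wait for the main loop to apply it, then press cross."""
    cmd(host, f"cursor {x} {y}", port=port)
    time.sleep(settle)
    return cmd(host, "press cross", port=port)
```

Without the settle delay, the press can hit-test against the cursor's previous position, since the queue is only drained once per frame.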

Layer 5: Live Framebuffer Streaming

The screencap command reads the PSP’s VRAM directly at 0x44000000 (uncached framebuffer base), crops from the 512-pixel stride to 480 visible pixels per row, and streams the raw ABGR pixel data over TCP with a simple header (480 272\n).

On the host side, a one-liner converts this to a viewable PNG:

$ echo "screencap" | nc -w 5 192.168.0.249 9293 > /tmp/psp.raw
$ tail -c +9 /tmp/psp.raw | ffmpeg -y -f rawvideo \
    -pixel_format abgr -video_size 480x272 \
    -i - -update 1 /tmp/psp.png
TV Guide in windowed mode — captured remotely via the screencap TCP command and converted to PNG with ffmpeg.
The OASIS OS dashboard with all 11 app icons — captured entirely via remote automation.
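The header-plus-raw-pixels format is also easy to parse without ffmpeg. A pure-Python sketch that splits the `<width> <height>` header line from the payload and swizzles ABGR byte order to RGBA (the function name is illustrative):

```python
def parse_screencap(stream: bytes) -> tuple[int, int, bytes]:
    """Split the `<w> <h>\\n` header from the raw ABGR payload and convert to RGBA."""
    header, _, pixels = stream.partition(b"\n")
    w, h = map(int, header.split())
    expected = w * h * 4
    if len(pixels) < expected:
        raise ValueError(f"short read: {len(pixels)}/{expected} bytes")
    pixels = pixels[:expected]
    # ABGR byte order -> RGBA: reverse each 4-byte group
    rgba = bytearray(expected)
    rgba[0::4] = pixels[3::4]  # R
    rgba[1::4] = pixels[2::4]  # G
    rgba[2::4] = pixels[1::4]  # B
    rgba[3::4] = pixels[0::4]  # A
    return w, h, bytes(rgba)
```

The RGBA buffer can then be handed to any image library, or written out as PNG directly, which is presumably how the toolkit's no-ffmpeg screencap works.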

Layer 6: Physical Hard Reset (The Last Mile)

Software reboot handles the normal case. But the PSP’s firmware occasionally hard-locks — the ME deadlock from Entry 07, GU command buffer corruption, or OOM crashes that hang the kernel. When this happens, the TCP server is unreachable and the only recovery is a physical power cycle: hold the power slider for several seconds.

The PSP has no Wake-on-LAN, no IPMI, no remote management interface. The power switch is a physical slider on the side of the device. To close the automation loop completely, we designed a hardware solution: a 12V linear actuator controlled by a USB relay module, driven by a Python script on the development PC.

How It Works

A small linear actuator (10mm stroke, 188N force) pushes the PSP’s power slider. A 2-channel USB relay module (LCUS-2 with CH340 serial chip) is wired as an H-bridge to control the actuator’s direction:

The USB relay H-bridge wiring. Two relays cross-wired to reverse polarity for extend/retract control.

The Python control script sends serial commands to the relay module:

$ python psp_actuator.py reboot
--- PSP HARD REBOOT ---
Phase 1: Power OFF (3/4 extend, 10s hold)
Holding 10 seconds...
Releasing slider...
Phase 2: Power ON (quick tap)
Done! PSP should be booting.
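The serial frames themselves are tiny. LCUS-series relay modules are commonly documented as accepting 4-byte frames at 9600 baud: a 0xA0 start byte, the channel number, the state, and an additive checksum. This is an assumption based on typical LCUS documentation, not the project's actual script; verify against your module's datasheet:

```python
def lcus_frame(channel: int, on: bool) -> bytes:
    """Build one LCUS relay command frame (assumed protocol: start byte,
    channel, state, additive checksum over the first three bytes)."""
    start = 0xA0
    state = 0x01 if on else 0x00
    checksum = (start + channel + state) & 0xFF
    return bytes([start, channel, state, checksum])

# With pyserial, a frame would be written to the CH340's port, e.g.:
#   serial.Serial("/dev/ttyUSB0", 9600).write(lcus_frame(1, True))  # relay 1 ON
```

The reboot sequence is then just timed frame writes: extend relay on, hold, off, retract relay on briefly, off.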

Parts List

Part                                         Approx. Price
12V mini linear actuator (10mm stroke)       ~$15
2-channel USB relay module (LCUS-2, CH340)   ~$12
12V 2A DC power adapter                      ~$8
Wago 221 lever connectors                    ~$8–10
22 AWG hookup wire                           ~$6
Wire stripper                                ~$10–15
Total cost: ~$55–65, all off-the-shelf, no soldering required.

Layer 7: Network Recovery (The Safety Net)

The actuator handles hard locks, but there’s a worse scenario: a bad EBOOT that crashes on boot. If the main application won’t start, the TCP server never comes up, and there’s no way to deploy a fix — even the actuator just power-cycles into the same crash.

ARK-4 CFW has a recovery mode: hold the R-trigger during boot and it loads a separate recovery application from ms0:/PSP/SAVEDATA/ARK_01234/RECOVERY.PBP instead of the normal boot path. All game plugins are disabled. We replaced ARK’s default recovery menu with a 154 KB Rust EBOOT that does exactly one thing: connect to WiFi and accept file uploads.

$ echo "ping" | nc -w 3 192.168.0.249 9293
pong
$ echo "status" | nc -w 3 192.168.0.249 9293
{"mode":"recovery","free_kb":23681,"max_blk_kb":23454,"wifi":true}

The recovery EBOOT uses the same TCP protocol as the main application — upload, reboot, ping, status. A bricked OASIS OS can be fixed by uploading a new EBOOT from the recovery server, then rebooting:

# PSP is in recovery mode (R-trigger held during boot)
$ SIZE=$(stat -c%s "EBOOT.PBP")
$ (echo "upload $SIZE ms0:/PSP/GAME/OASISOS/EBOOT.PBP"; cat "EBOOT.PBP") | nc -w 30 192.168.0.249 9293
ok
$ echo "reboot" | nc -w 3 192.168.0.249 9293
ok
# PSP reboots normally with fixed EBOOT

With a second actuator on the R-trigger, the full recovery sequence is automatable: power on with R-trigger held → recovery WiFi connects → upload fixed files → reboot normally. The PSP becomes unbrickable over WiFi.

Verified: uploaded a 4.8 MB EBOOT through the recovery server, rebooted, and OASIS OS came up normally — the full round trip from recovery to working application in under 30 seconds.

The Complete Automation Loop

With all seven layers in place, the full development cycle looks like this:

  1. Build the EBOOT on the host (cargo psp --release)
  2. Deploy over WiFi via deploy or upload (~5 seconds for 4.8MB)
  3. Reboot via reboot TCP command (cold hardware reset)
  4. Wait ~20 seconds for boot + AutoStart + WiFi auto-connect
  5. Verify via ping, status, screencap
  6. Test via cursor, press commands to navigate UI
  7. Diagnose via log, logfull, screencap
  8. Recover from hard locks via the physical actuator
  9. Unbrick via network recovery if the EBOOT is broken (R-trigger boot + upload fix)

The unified toolkit scripts/psp_remote.py wraps all of this into simple commands, with retry logic, CRC32 deploy verification, and built-in PNG screencap (no ffmpeg dependency):

# Full cycle: build, deploy, reboot, wait, verify
$ python3 scripts/psp_remote.py cycle target/.../EBOOT.PBP

# Or build + deploy + reboot in one step
$ python3 scripts/psp_remote.py build-cycle

# Remote UI navigation
$ python3 scripts/psp_remote.py cursor 290 200
$ python3 scripts/psp_remote.py press cross

# Capture and view screen (pure-Python PNG, no ffmpeg needed)
$ python3 scripts/psp_remote.py screencap /tmp/psp.png

# Upload kernel plugin over WiFi
$ python3 scripts/psp_remote.py upload oasis.prx ms0:/seplugins/oasis.prx

# Check device state
$ python3 scripts/psp_remote.py status
{"kiosk":"tv_guide","free_kb":14230,"max_blk_kb":14100,"frame":8520,"audio_only":false}

# Run a repeatable test sequence from YAML
$ python3 scripts/psp_remote.py sequence scripts/sequences/smoke-test.yaml

# Hard reboot via USB relay actuator when PSP is frozen
$ python3 scripts/psp_remote.py hard-reboot
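The CRC32 deploy verification can be sketched with the standard library. The comparison side assumes the device reports its checksum as an 8-digit hex string; the actual response format in `psp_remote.py` may differ:

```python
import pathlib
import zlib

def crc32_of(path: str) -> int:
    """Stream the file through zlib.crc32 so large EBOOTs aren't held in memory twice."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(64 * 1024), b""):
            crc = zlib.crc32(chunk, crc)
    return crc & 0xFFFFFFFF  # normalize to unsigned 32-bit

def verify_deploy(local_path: str, device_crc_hex: str) -> bool:
    """Compare the local file's CRC32 against the device-reported value
    (hypothetical response format: an 8-digit hex string)."""
    return crc32_of(local_path) == int(device_crc_hex, 16)
```

A mismatch after transfer means the write to the Memory Stick was truncated or corrupted, and the deploy should be retried before rebooting into the new binary.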

Proving It: Debugging H.264 Video Decode Remotely

The automation infrastructure was built specifically to solve a hard problem: getting hardware H.264 video decode working on the PSP’s Media Engine. The ME deadlocks after ~70 frames on >480p content (Entry 07), and each debugging iteration required deploying new code, rebooting, navigating to the TV Guide app, tuning a channel, waiting for the deadlock, reading logs, and trying the next fix.

With the automation in place, the entire video decode pipeline was debugged remotely:

Each iteration was: code change → build → TCP deploy → cold reboot → remote navigate to TV Guide → tune channel → screencap to verify → read logs. Average cycle time: under 90 seconds. Over 30 iterations in a single session, zero human interaction with the device.

H.264 video from the PSP Media Engine — decoded at 656×480 on real hardware, captured via remote screencap. The content (Bits and Bytes, 1983) streams from archive.org over TLS 1.3.

Closing the Loop: AI-Agent-Driven Hardware Development

Full closed-loop demo: an AI agent connects to the PSP over WiFi, launches TV Guide, tunes a channel, windows the app, then power-cycles the device via a USB relay-controlled linear actuator — all without touching the hardware.

What makes this infrastructure unusual is not the individual pieces — TCP servers and remote deployment are standard practice. It’s that the entire system was designed to close the loop for an AI coding agent, removing the human from the edit-compile-deploy-test cycle on physical hardware.

An AI agent (Claude) built the TCP server, deployed it to the PSP, navigated the UI by calculating icon pixel coordinates and injecting cursor+click events, captured screenshots to visually verify rendering, read logs to diagnose bugs, and iterated on fixes — all through the same tool interface it uses for reading files and running shell commands. The PSP appeared to the agent as just another development target, despite being a 2004 handheld with no standard debug interface.

The closed loop is what made the video decode breakthrough possible. Debugging the ME deadlock required testing dozens of hypotheses — cache flush timing, pixel format combinations, stride calculations, kernel hook configurations — each requiring a full deploy-reboot-test cycle. With a 90-second loop and no human in it, the agent could run 30+ experiments in a single session. A manual workflow at 5 minutes per cycle would have taken an entire day.

This pattern generalizes beyond the PSP. Any embedded device with a network connection and a text command interface becomes AI-debuggable. The hard part is not the AI; it’s building the infrastructure that bridges the gap between “run a shell command” and “press a button on a handheld.” Once that bridge exists, the loop is closed, and iteration speed is limited only by compile time and boot time — not by human availability.

Lessons Learned