This article reflects MLNavigator Research Group work. Deployment lives in AdapterOS.

Offline AI Without Third-Party Gravity

October 03, 2025

Every modern software stack has a gravitational pull toward third-party services. AI systems have it worse.

The Gravity Problem

Consider a typical AI application:

  • Model weights: Hosted on Hugging Face
  • Runtime: Calls home for updates
  • Dependencies: NPM/pip packages with their own dependencies
  • Telemetry: Usage analytics by default
  • Licensing: Periodic license checks

Each of these creates a dependency on external services. Some are obvious (can't download the model without internet). Others are subtle (the library phones home on import).

For air-gapped deployments, every external call is a failure mode.

Mapping the Dependencies

Before going offline, audit every network call:

# Watch for outbound connections during startup
sudo lsof -i -P | grep ESTABLISHED

# Check DNS queries
sudo tcpdump -i any port 53

# Monitor during inference (on Linux, `ss -tnp` is the modern equivalent)
netstat -an | grep ESTABLISHED

You'll find surprises. Libraries checking for updates. Logging frameworks calling analytics endpoints. License managers pinging auth servers.
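These one-off checks can be wrapped into a small monitor that records every connection a workload opens over its lifetime. A sketch (the script name, log path, and placeholder workload are illustrative, and `lsof` flags vary slightly by platform):

```shell
#!/bin/sh
# net_snapshot.sh -- record unique outbound connections while a
# workload runs. The workload below is a stand-in; substitute your
# model server's startup command.
WORKLOAD="sleep 3"               # illustrative placeholder
LOG=netcalls.log
: > "$LOG"                       # start from an empty log
$WORKLOAD &                      # launch the workload in the background
PID=$!
while kill -0 "$PID" 2>/dev/null; do
    # -a ANDs the filters: network sockets (-i) of this process (-p)
    lsof -nP -a -i -p "$PID" 2>/dev/null >> "$LOG"
    sleep 1
done
sort -u "$LOG"                   # one line per distinct connection
```

Run the full startup plus an inference or two under it; anything in the log that isn't loopback is a dependency to cut.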

Cutting the Strings

Model Weights

Download once, hash, store locally:

# Download
wget https://example.com/model.bin

# Hash for verification
sha256sum model.bin > model.bin.sha256

# Verify on air-gapped system
sha256sum -c model.bin.sha256

Never rely on dynamic downloads.
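The verification step can be made fail-closed, so the service never starts on a tampered or corrupted weight file. A sketch (file names and contents are stand-ins; on macOS, `shasum -a 256` replaces `sha256sum`):

```shell
#!/bin/sh
# Fail closed: refuse to proceed unless the recorded hash matches.
set -eu
MODEL="model.bin"                      # illustrative file name
printf 'example weights' > "$MODEL"    # stand-in for the real download
sha256sum "$MODEL" > "$MODEL.sha256"   # done once, on the connected side
if ! sha256sum -c "$MODEL.sha256" >/dev/null 2>&1; then
    echo "hash mismatch for $MODEL -- refusing to load" >&2
    exit 1
fi
echo "verified $MODEL"
```

Wiring this into the service's startup path means a bad copy fails loudly at boot rather than silently at inference time.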

Dependencies

Vendor everything:

# Python
pip download -r requirements.txt -d ./vendor
pip install --no-index --find-links ./vendor -r requirements.txt

# Node (if you must): warm a local cache online, then install from it
npm ci --cache ./npm-cache
npm ci --offline --cache ./npm-cache

Audit vendored packages for network calls.
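A crude but effective first pass is to unpack the vendored archives and grep for networking imports. A sketch for Python packages (the pattern list is illustrative, not exhaustive; wheels are zip files, so `unzip` works on them -- the stand-in file below just lets the sketch run end to end):

```shell
#!/bin/sh
# Flag vendored Python source that imports networking modules.
mkdir -p vendor-src
# In practice: for whl in ./vendor/*.whl; do unzip -q "$whl" -d vendor-src; done
printf 'import socket\n' > vendor-src/suspect.py   # stand-in source file
grep -rnE 'import (requests|urllib|http\.client|socket|aiohttp)' \
    vendor-src || echo "no obvious network imports found"
```

A hit is not proof of phone-home behavior -- plenty of libraries import `socket` for legitimate reasons -- but every hit is a file worth reading.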

Runtime Configuration

Disable update checks, telemetry, and analytics:

{
  "updates": { "check": false },
  "telemetry": { "enabled": false },
  "analytics": { "enabled": false }
}

Some libraries ignore these settings. Test with network isolation.
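Config files are not the only switch: several ecosystems also honor environment variables, which are easier to enforce uniformly from a service manager. A sketch (verify each against the library's own documentation; not every tool respects these):

```shell
# Opt-out environment variables honored by some common AI tooling.
export HF_HUB_OFFLINE=1         # huggingface_hub: no network lookups
export TRANSFORMERS_OFFLINE=1   # transformers: use the local cache only
export DO_NOT_TRACK=1           # consoledonottrack.com convention
```

Setting these at the service-unit level means they apply even when a config file is overwritten by an update.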

Licensing

Avoid software requiring periodic license validation. If unavoidable:

  • Get offline license terms in writing
  • Test that offline operation works for your deployment period
  • Have a fallback plan

Testing Offline Operation

Real testing means actual isolation:

# macOS: disable all networking
networksetup -setairportpower en0 off
sudo ifconfig en0 down

# Or use a firewall to block all egress
sudo pfctl -e
echo "block out all" | sudo pfctl -f -

Run your full workflow. Every failure reveals a hidden dependency.

The Telemetry Problem

Many AI libraries include telemetry by default. They call it "anonymous usage data" or "crash reporting."

For high-assurance deployments, this is unacceptable:

  1. It reveals that you're using AI (operational security)
  2. It may leak input characteristics (data security)
  3. It creates network dependencies (availability)

Audit library source code for telemetry. Disable it or use libraries that don't include it.
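Beyond import-level auditing, a search for hard-coded analytics endpoints catches telemetry that hides behind its own HTTP client. A sketch (the domain list and stand-in file are illustrative only):

```shell
#!/bin/sh
# Search library source for well-known telemetry/analytics endpoints.
mkdir -p libsrc
printf 'DSN = "https://abc@o1.ingest.sentry.io/42"\n' > libsrc/cfg.py  # stand-in
grep -rnE 'sentry\.io|segment\.(io|com)|google-analytics\.com|posthog\.com' \
    libsrc || echo "no known telemetry domains found"
```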

Conclusion

Offline operation isn't just "works without Wi-Fi." It means no external dependencies at runtime, no data exfiltration paths, no phone-home behavior.

Achieving this requires:

  • Complete dependency auditing
  • Vendored and hashed packages
  • Disabled telemetry at every layer
  • Network-isolated testing

The gravitational pull toward third-party services is strong. Resisting it takes deliberate architecture.