Proxy (a stand-in that controls access)

When to use

  • Add caching, auth, logging, rate limits, or lazy init without changing the real object.
  • You call a slow/remote service (secrets, schemas, feature store) and want read-through cache.
  • You need a guard around a resource (check token/permissions before calls).

Avoid when a plain decorator or functools.lru_cache on a simple function already solves it.
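For that simple case, the stdlib route looks like this (a minimal sketch; `load_config` is a made-up stand-in for any slow, pure lookup):

```python
from functools import lru_cache

@lru_cache(maxsize=128)
def load_config(name: str) -> str:
    # hypothetical slow lookup; repeat calls with the same name are memoized
    return "value-for-" + name
```

Note that `lru_cache` has no TTL; when entries must expire, that is where a proxy (or `cachetools.TTLCache`) earns its keep.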

Diagram (text)

Client ──> Secrets (interface)
             ▲
      CachingSecretsProxy  ── calls ──> RemoteSecrets
        (adds cache/TTL)                (does real I/O)

Python example (≤40 lines, type-hinted)

Read-through cache in front of a remote secrets service.

from __future__ import annotations
from dataclasses import dataclass, field
from typing import Protocol, Callable
import time

class Secrets(Protocol):
    def get(self, name: str) -> str: ...

@dataclass
class RemoteSecrets:
    fetch: Callable[[str], str]  # e.g., AWS/GCP API call
    def get(self, name: str) -> str:
        return self.fetch(name)

@dataclass
class CachingSecretsProxy:
    inner: Secrets
    ttl: float | None = 300.0
    _cache: dict[str, tuple[float, str]] = field(default_factory=dict)
    _now: Callable[[], float] = time.time  # injectable for tests
    def get(self, name: str) -> str:
        if name in self._cache:
            ts, val = self._cache[name]
            if self.ttl is None or self._now() - ts < self.ttl:
                return val
        val = self.inner.get(name)
        self._cache[name] = (self._now(), val)
        return val

Tiny pytest (cements it)

def test_proxy_caches_reads():
    calls = {"n": 0}
    def fetch(name: str) -> str:
        calls["n"] += 1
        return "secret-" + name
    proxy = CachingSecretsProxy(RemoteSecrets(fetch), ttl=None)  # never expires
    assert proxy.get("token") == "secret-token"
    assert proxy.get("token") == "secret-token"
    assert calls["n"] == 1  # second read served from cache
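A second test worth writing: TTL expiry, driven through the injectable `_now` clock so no real time passes. The classes from the example are restated here so the snippet runs standalone.

```python
from __future__ import annotations
from dataclasses import dataclass, field
from typing import Callable, Protocol
import time

class Secrets(Protocol):
    def get(self, name: str) -> str: ...

@dataclass
class RemoteSecrets:
    fetch: Callable[[str], str]
    def get(self, name: str) -> str:
        return self.fetch(name)

@dataclass
class CachingSecretsProxy:
    inner: Secrets
    ttl: float | None = 300.0
    _cache: dict[str, tuple[float, str]] = field(default_factory=dict)
    _now: Callable[[], float] = time.time
    def get(self, name: str) -> str:
        if name in self._cache:
            ts, val = self._cache[name]
            if self.ttl is None or self._now() - ts < self.ttl:
                return val
        val = self.inner.get(name)
        self._cache[name] = (self._now(), val)
        return val

def test_proxy_expires_after_ttl() -> None:
    calls = {"n": 0}
    def fetch(name: str) -> str:
        calls["n"] += 1
        return "secret-" + name
    clock = {"t": 0.0}  # fake time source, advanced by hand
    proxy = CachingSecretsProxy(RemoteSecrets(fetch), ttl=10.0, _now=lambda: clock["t"])
    proxy.get("token")
    clock["t"] = 5.0    # still inside the TTL: served from cache
    proxy.get("token")
    assert calls["n"] == 1
    clock["t"] = 11.0   # past the TTL: proxy refetches
    proxy.get("token")
    assert calls["n"] == 2
```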

Trade-offs & pitfalls

  • Pros: Add cross-cutting behavior without touching the real class; easy to swap/remove; great for tests.
  • Cons: Another layer to reason about; cache invalidation is hard.
  • Pitfalls:
    • Stale data (TTL too long) or thrash (TTL too short).
    • Unbounded cache growth—consider size limits/eviction.
    • Swallowing exceptions—forward or wrap with context; don’t hide failures.
    • Thread/process safety—a plain dict is neither locked across threads nor shared across processes; add a lock or use a shared cache (Redis) if needed.
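On the thread-safety point: if one process shares the proxy across threads, wrapping the read-through path in a lock is the minimal fix. A sketch:

```python
import threading

class LockingSecretsProxy:
    """Same get() interface; serializes access to a non-thread-safe inner object."""

    def __init__(self, inner) -> None:
        self.inner = inner
        self._lock = threading.Lock()

    def get(self, name: str) -> str:
        with self._lock:  # one thread at a time reads through
            return self.inner.get(name)
```

Because it keeps the same interface, it stacks with the caching proxy: `LockingSecretsProxy(CachingSecretsProxy(remote))`.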

Pythonic alternatives

  • functools.lru_cache on a plain function; cachetools.TTLCache for TTL + size.
  • Decorators for logging/metrics; contextlib for lazy init/teardown.
  • An Adapter if you need to convert one interface to another; a Proxy keeps the same interface and only adds behavior.
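The decorator route for the logging case, as a stdlib-only sketch (`fetch_schema` is a made-up target; a decorator covers one function, where a proxy covers a whole object):

```python
import functools
import logging

def logged(fn):
    """Log each call, preserving the wrapped function's metadata."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        logging.getLogger(__name__).info("calling %s", fn.__name__)
        return fn(*args, **kwargs)
    return wrapper

@logged
def fetch_schema(name: str) -> str:
    # hypothetical slow call; stands in for a real schema-registry lookup
    return "schema:" + name
```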

Mini exercise

Add a max_size to CachingSecretsProxy. Evict the oldest cached entry when capacity is exceeded. Write a test that requests 3 keys with max_size=2 and asserts only the 2 most recent remain cached.

Checks (quick checklist)

  • Proxy keeps the same interface as the real thing.
  • Extra behavior is orthogonal (cache, auth, logging, rate limit).
  • Clear policy for TTL/size/eviction and error handling.
  • Tests prove the proxy adds behavior while results stay the same.
  • Easy to bypass (use the real object) when needed.